THE INEVITABILITY OF UBI…:

The white-collar class derided mass layoffs among the blue-collar workers. It’s about to feel their pain (Glenn H. Reynolds, Jan. 16th, 2024, NY Post)

[T]he worm has turned. Google is looking at laying off 30,000 people it expects to replace with artificial intelligence.

The Wall Street Journal reports that large corporations across the board are planning to lay off white-collar workers.

Investor Brian Wang notes ChatGPT is already causing white-collar job loss.

In fact, ChatGPT can even code.

Sometimes its code is quite good. Sometimes it’s not so good.

(Though God knows, the latter is true of much human-generated software code too.)

It can write press releases, ad copy, catalog descriptions, news stories and essays, speeches, encyclopedia entries, customer-inquiry responses and more.

It can generate art on demand that’s suitable for book covers, advertisements and magazine illustrations.

Again, sometimes these items are quite good, and sometimes they’re not, but there’s a lot of less-than-stellar human work in those categories too.

Learning to code is bad advice now.

And the kicker is, AI is getting better all the time.

GPT-4 has demonstrated “human-level performance” on many benchmarks.

It can pass bar exams, diagnose disease and process images and text. The improvement since GPT-3.5 is significant.

People, on the other hand, are staying pretty much the same.

The bad news for the symbolic analysts is they’re playing on AI’s turf.

When you deal with ideas and data and symbols, you’re working with bits, and AI is pretty good at working with bits.

People losing their jobs to AI is just the tip of the iceberg.

In the next decade, lots more people — possibly (gulp) including professors like me — will be facing potential replacement by machines.

It turns out that using your brain and not your hands isn’t as good a move as it may have once seemed.

…is a function of the fact that “we” are not going to have jobs, not just “them”.

“ALL MEN”:

Misunderstanding antisemitism in America (Musa al-Gharbi, 1/11/24, Slow Boring)


Contrary to widespread narratives, students do not internalize the views of their professors very often. Young people’s attitudes tend to be fairly stable throughout their college careers, and the limited change that occurs seems to be driven much more by peers than professors.

And far from pushing politics in the classroom, surveys suggest that more than 80 percent of scholars who work on Middle East issues self-censor on the topic of Israel and Palestine. Overwhelmingly, this self-censorship entails refraining from criticism of Israel, typically out of fear of retaliation by external stakeholders, university administrators or student mobs.

Moreover, rather than education pushing people to hold antisemitic or anti-Israel views, college attendance and completion are inversely correlated with antisemitism. The overwhelming majority of college graduates embrace one or fewer of the Anti-Defamation League (ADL)’s fourteen antisemitic attitudes. And even people who just attended some college but didn’t graduate tend to be significantly less antisemitic than those who didn’t go to college at all:

Higher education also corresponds to greater knowledge about the Holocaust and lowered propensity to engage in Holocaust denial.

And although this question is importantly distinct from antisemitism per se, the more college Americans get, the more likely they become to express positive views of Israel (and the less likely they become to view Israel unfavorably).

Why are so many people convinced that the opposite is true?

In part, it’s because, as has chronically been the case in “campus culture war” discourse, narratives about colleges and universities after October 7 have been driven heavily by sensationalized events at a small number of elite schools whose culture, policies and students are deeply unrepresentative of higher ed writ large.

Exacerbating this problem, many inappropriately conflate trends among young people as a whole with trends among college students in particular, and then blame institutions of higher learning and “radical professors” for trends that are common among young people writ large, even those who did not attend college.

The widespread tendency to conflate opposition to Zionism, criticism of the Israeli government, or support for the Palestinian cause with antisemitism reinforces these misperceptions.

There’s nothing more American than the insistence on universal self-determination.

DENYING CONSENSUAL GOVERNANCE:

The Constitution’s Overlooked Road Map for an Accountable Bureaucracy (Alison Somin, 1/18/24, Discourse)

Today there are hundreds, if not thousands, of officials in the federal government who exercise expansive power who are not confirmed by the Senate, are not accountable to the president, or both. To fix this broken system, it’s necessary to revitalize the president’s powers to appoint and remove executive officials.


In the late 19th and early 20th centuries, the progressive movement grew increasingly critical of the original constitutional design. The progressives wanted to move power away from the democratically elected president and direct appointees into the hands of supposedly impartial, nonpolitical experts.

Their moment came in the 1930s, when the crisis of the Great Depression led to demand for extraordinary measures. Congress created a slew of new executive agencies and made it impossible for the president to fire many of the officials who populated those agencies except for cause. And over the ensuing decades, as these agencies pushed the bounds of their own power, decision-making power accumulated with officials who were never constitutionally appointed.

Early progressives and contemporary defenders of the administrative state have defended removal protections for federal officials because they allow those officials to be “insulated from politics.” But put another way, this is ultimately an attempt to wrest the levers of government power away from the people. It’s incompatible with the Constitution’s promise of self-government, the beating heart of the American experiment. The people deserve the government they choose, whether it comports with the preferences of the “experts” or not.

There is value to having the executive branch staffed by experts with technical knowledge. But technical knowledge is only one part of the puzzle that is policymaking. Values also matter, and the ability to make tradeoffs among competing values is one of the most important parts of governing. Those tradeoffs must be made by the people’s representatives or, at the very least, officials who are directly accountable to them.

Unlike the intentional spread of removal protections, the plethora of federal officials who wield government power without being vetted by the Senate developed as much by default as by design. The New Deal and the Great Society vastly expanded the footprint of government interference in the lives of everyday Americans. All those rules and enforcement actions overwhelmed the capacity of officials who had been appointed by the president and confirmed by the Senate. Rather than appointing more of these officials, the executive branch devolved lots of power to employees who were never appointed in an accountable manner.


Regulatory overreach by officials who are not constitutionally appointed appears to be all too common. One Pacific Legal Foundation study found that 71% of rules issued by the Department of Health and Human Services were unconstitutional because the officer signing them was never appointed by the president and confirmed by the Senate.