Boom Towns, USA


The dominant housing story of the last century was an exodus of those with means from cities to suburbs.  The American Dream consisted of a white-picket fence around a private yard, 2.4 children in the home, and a nice car or two.  Today, the Dream is changing.  Sure, the suburbs still offer a great deal, but there’s a powerful countertrend that is increasingly hard to ignore: a renaissance in cities, as they draw empty nesters and young professionals alike to a vibrant urban lifestyle.

By some estimates, over 250,000 people reach retirement age in the United States every month.  And although the share of baby boomers living in urban areas has decreased since 2000 on an aggregate basis, an important subset of empty nesters is flocking back to choice American cities.

Take Boston, for example.  The northeastern city is reportedly the most in-demand urban destination for buyers between the ages of 65 and 74, a range that includes the oldest five years of boomers.  That demographic is buying more homes overall than every segment besides Millennials and Generation X, and Beantown is their first choice among cities.  Some former suburbanites have even formed ‘expat’ groups, according to the Boston Globe.  Incidentally, this dynamic risks pushing suburban real estate prices down as the number of homes for sale rises.  Indeed, in 2016, suburban home prices have been weak, while Boston real estate values have surged.


But it’s not just about retirees flocking to cities.  Young professionals are also increasingly working and living downtown, drawn by exciting employment opportunities.  Between 2010 and 2014, half of new business growth in the US came from 20 urban counties, and half of all job growth in that period came from 73 counties.  And according to a recent Wall Street Journal story, one factor behind the Beantown boom has been an influx of people choosing to live in the city.  Between 2010 and 2014, for instance, the population of Boston grew by 6%, double the national rate.

Many of America’s best-performing cities are, unsurprisingly, science and tech hubs: the usual suspects like San Francisco, New York, and Seattle, but also lesser-known ones like Raleigh.  It’s not just science and tech, though—just look to the Lone Star State.  Despite the drop in oil prices, Austin and Dallas have enjoyed resilient economies fueled not just by high-tech industry, but also by festivals, logistics, financial services, and a general business-friendly environment.


This urban renaissance has generated a strong real estate market in America’s boom towns.  Rents and home prices alike are climbing.  High-end urban apartment rent growth peaked at 8% per year in 2011 and remains well north of inflation.  Rental increases are higher for large, in-demand cities like New York, Boston, and San Francisco—and even higher for the most expensive units within these markets.

I worry about the sustainability of these dynamics, in large part because when markets work well, higher prices stimulate supply.  And that’s exactly what we’re seeing in many cities.  According to the Wall Street Journal, “In 25 of the largest U.S. cities, multifamily permits in urban areas were up 39% in 2015 compared with a year earlier.”  Major cities like New York, Philadelphia, and Boston are all expecting housing supply growth between 2 and 3 times the historical average in the next year.  Although demand in these cities is robust, it’s nevertheless worth watching to see if today’s boom turns into tomorrow’s bust.

When markets work well, higher prices stimulate supply. And that's exactly what we're seeing in many cities.

The most recent issue of Worth magazine highlighted some of the cities that have benefited from—and actively helped to stimulate—the urban boom.  Worth’s selection of dynamic American cities includes San Diego, Dallas, Charleston, Nashville, New Orleans, San Francisco, and New York, among others.  What unites these cities? As Worth editor Richard Bradley summarized, successful cities have used effective public policy to make downtowns both livable and business-friendly, while embracing existing assets—like Nashville’s music scene or San Diego’s science infrastructure.

Yet the cities that Worth profiled are not homogenous.  They’re each vibrant in their own ways.  San Diego, for example, has ample human capital, support for startups, and a commitment to infrastructure investment.  Dallas has transformed into a booming cultural center.  Charleston, home to an important port, has attracted global manufacturers without losing its historic charm.  To Worth’s list I’d add Boston, whose Seaport District alone has managed to attract everything from tech startups to General Electric in recent years.

Unfortunately, an influx of wealthy young people tends to make attractive cities like these less affordable.  So what’s the best way to sustain the urban renaissance?  New construction—via relaxed zoning restrictions—may be a partial solution. For this reason, as Matthew Yglesias put it, “the elevator could be the next great disruptive technology.”  Increasing density can push down prices without generating sprawl.


But just as technology could bolster the urban renaissance, it could also endanger it.  The Wall Street Journal’s Christopher Mims suggests that the rise of self-driving cars might take the new urban enthusiasts to the suburbs.  It’s not hard to imagine young affluent Millennials being wooed by futuristic vehicles conveniently escorting them to and from the spacious suburbs.

For now, as noted by Worth’s Bradley, “we are living in a golden era of American cities.”  And the trends don’t show signs of reversing.  Boston, for instance, is expecting 90,000 new residents in the next 14 years.  Especially in this time of general global instability, with Brexit, choppy asset markets, and falling commodity prices increasing fear across the board, there are some things worth celebrating.  The flourishing of our cities is one of them.

Hyperinflation in America: The End of Grades?


Grade inflation—no, hyperinflation—is running rampant in American higher education.  At Yale, where I have been both a student and an instructor, the average GPA has risen considerably over the past 50 years. And my alma mater is not unique.

A recent study revealed that 42% of four-year college grades are A’s, and 77% are either A’s or B’s. According to Inside Higher Ed, “At four-year schools, awarding of A's has been going up five to six percentage points per decade and A's are now three times more common than they were in 1960.” At Yale, 62% of grades were in the A range in the spring of 2012. That figure was only 10% in 1963.

“A’s are now three times more common than they were in 1960.”

Back then, a C was the grade college students received most frequently. Later in the decade, the B took its place. Professors boosted students’ grades in part because, if the kids did too poorly, they could be shipped off to the war in Vietnam. The B reigned supreme till the 1990s, when the A claimed the crown. It's been strengthening its lead ever since.

What caused the most recent bout of grade inflation? Some point to the rise in tuition—and educational debt—driving students to expect to be treated as consumers, or teachers boosting grades in exchange for better course reviews. Catherine Rampell of the Washington Post has suggested this dynamic has led to a “GPA arms race.” Others argue that student bodies have simply gotten smarter, but by a variety of metrics, this is not the case: standardized test scores and graduate literacy have not improved as grades have risen.

Ironically, attempts to combat grade inflation don’t get a passing grade. In the early 2000s, Wellesley College implemented a policy setting a B+ or lower as the mean grade in many classes. As a result, the average GPA fell from 3.55 to 3.28 over a couple of years. Since then, it’s been slowly drifting back up.

Attempts to combat grade inflation don’t get a passing grade.

Princeton had a similar experience, as it too attempted to combat grade inflation in the early part of the 21st century. It urged departments to award A’s for no more than 35% of course grades. But by 2014, it ended its decade-old grade deflation policy, citing the unnecessary stress it placed on students. The next year, grades rose back to their 2002 levels.

Grade inflation is a classic collective action problem. Even if individual teachers want to fight back, they risk harming their students arbitrarily in the process. A single bad grade can set students apart when bad grades are hard to come by.

A recent op-ed by an Emory junior, who hopes to go to medical school, described the tears and desperation that came with receiving her first B+. She’s not alone. I've seen a similar dynamic at Yale with students applying to law school.

With hyperinflation, prices can spiral up indefinitely.  Unlike economic price levels, however, grades have an upper bound. For this reason, when they inflate, they all bunch into the A range. Dispersion decreases, and grades become less and less useful at distinguishing excellent from average student performance. The signaling value of an A is undermined if A’s are ubiquitous.
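A toy simulation (with made-up numbers, not real GPA data) illustrates the bunching effect: when underlying performance drifts up against a fixed 4.0 ceiling, the observed spread of grades collapses.

```python
# Illustrative simulation with hypothetical parameters: grades are capped at 4.0,
# so as the average drifts toward the ceiling, scores bunch at the top and the
# measurable gap between excellent and average students shrinks.
import random
import statistics

random.seed(0)

def simulate_gpas(target_mean, n=10_000, spread=0.6, cap=4.0):
    """Draw underlying performance scores and clip them at the grade ceiling."""
    return [min(cap, random.gauss(target_mean, spread)) for _ in range(n)]

for mean in (2.0, 3.0, 3.8):
    gpas = simulate_gpas(mean)
    print(f"target mean {mean}: observed sd = {statistics.stdev(gpas):.2f}")
```

With a low average, almost no scores hit the cap and the full spread survives; with an average near 3.8, a large share of students pile up at exactly 4.0 and the standard deviation drops sharply, which is the loss of signaling value described above.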


But is grade inflation really a problem? Critics say that without meaningful grading, students will have less motivation and knowledge of their strengths and weaknesses, while employers will struggle to differentiate between potential hires. Others, like my classmate Mark Oppenheimer, argue grade inflation is not a big deal. As he points out, some studies suggest grades are not effective motivators, and employers have already ditched grades as an important hiring criterion, without social collapse. Yale’s medical school, Mark notes, one of the top schools in the world, manages fine without grades. So does Hampshire College, a liberal arts school in Western Massachusetts.

Whether we want it or not, we’re effectively barreling towards a world without grades. Because of the collective action problem grading presents, only a top-down solution can handle the issue decisively and fairly.

Think back to how Fed chair Paul Volcker raised interest rates in the early 1980s to contain rampant inflation. It caused a recession—and thus great shared pain—but basically solved the problem. Likewise, students may have to accept the collective psychic shock of grade deflation in order to secure a robust grading system, if we want to preserve it at all.

Whether we want it or not, we’re effectively barreling towards a world without grades.

The epidemic of grade inflation provides a welcome opportunity to step back and reevaluate the role these marks play in the lives of students, recruiters, and educators alike. We may decide we want to revalue grades; we may decide to abolish them. Whatever the conclusion, thoughtful reflection on the role of academic feedback can only improve our educational institutions.

Time To Prepare For A Recession


After seven years of expansion, the US economy appears to be headed for a recession. Earlier this month, a weak jobs report bolstered fears that hiring has been slowing and hundreds of thousands have been dropping out of the labor force. Employment growth is running at half the pace it was in 2015. Meanwhile, the Fed’s broader Labor Market Conditions Index has been dropping, economists have lowered their estimates of job growth over the next 12 months, business capital investment is falling, the manufacturing sector is on the verge of contracting, and the economy grew at a rate of only 0.8% in the first three months of this year.

A number of other recession indicators are blinking. While the yield curve on Treasuries may no longer be the clear guide to future downturns it once was, the difference between 2- and 10-year Treasury note yields is the smallest it’s been since before the last recession. This matters because it telegraphs investor pessimism via strong demand for long-term bonds. It also foretells a possible contraction of bank lending. How’s that? Because banks earn long-term yields and pay depositors short-term rates, a flattening yield curve implies shrinking profits—a pattern that usually results in banks lending less.
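The bank-lending channel is simple arithmetic; a stylized sketch with invented rates (not actual market yields) makes it concrete.

```python
# Toy model of a bank's net interest margin (all rates are hypothetical).
# A bank roughly pays depositors short-term rates and earns long-term yields,
# so its margin tracks the spread between the two ends of the curve.

def net_interest_margin(short_rate, long_yield):
    """Spread between what the bank earns and what it pays, in percentage points."""
    return long_yield - short_rate

# Steep curve: pay depositors 0.5%, earn 3.0% on long-term loans.
steep = net_interest_margin(0.5, 3.0)

# Flattened curve: short rates rise to 1.0% while long yields fall to 1.8%.
flat = net_interest_margin(1.0, 1.8)

print(f"steep-curve margin: {steep:.1f} pts, flat-curve margin: {flat:.1f} pts")
```

In this sketch the flattening cuts the margin from 2.5 to 0.8 percentage points; with thinner margins on each new loan, banks have less incentive to extend credit, which is how a flat curve can foreshadow tighter lending.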

Depressing stats about the US economy are everywhere.  Corporate profits have declined—a dynamic that has historically been associated with recessions. In the last three months of 2015, corporate profits fell over 10% compared to the previous year, the biggest drop since 2008. Although they ticked up at the start of 2016, a tighter labor market could erode this progress. As Morgan Creek Capital Management chief executive Mark Yusko pointed out at John Mauldin’s Strategic Investment Conference in Dallas last month, the last few times consensus future earnings estimates for the S&P 500 turned negative were 1990-1991, 2001, and 2008-2009—the last three recessions. It happened again at the end of 2015. Moreover, growth in household net worth has slowed to zero, a shift that has also coincided with past recessions.

Depressing stats about the US economy are everywhere. 

Yet not everyone is fixated on doom and gloom. As the economist Justin Wolfers pointed out, the dire jobs report was not as bad as it seemed, because seasonal factors and the Verizon strike may have made the situation look worse than it really was. Since the report, more than 35,000 Verizon workers are back to work. Demographic factors—baby boomer retirement in particular—also reduce the number of jobs we need to create each month to maintain our low level of unemployment, Wolfers added. On top of that, consumer spending is on the rise, service sector revenue has picked up, and data released last week showed that unemployment benefits claims had fallen.


In the aftermath of the jobs report, Fed chair Janet Yellen emphasized the positive over the negative, pointing to strong auto and housing markets, and signs of worker confidence. Despite her cautious optimism, we shouldn’t be surprised if growth turns negative. The current expansion is already one of the longest in modern US history. As Larry Summers points out, if history is any guide, a recession is more likely than not to happen in the next three years.

Even over the next 12 months, economists think there’s a 21% chance of recession.  A Wall Street Journal survey of economists emphasized four factors in particular that could lead to a contraction. First, the economy is already growing at a slow pace, meaning that otherwise minor issues—let alone major ones—could push it into negative territory. Second, many are concerned about a downturn in the Chinese economy, which could spread to the US. Third, declining business investment is making economists nervous; spending on capital goods has declined 12% since September 2014. Finally, they point to the uncertainty surrounding this year’s presidential race as a possible recession catalyst.


Other potential risks include the UK voting to leave the EU, a recession in Japan, an oil price shock, and an asset price collapse. Macro strategist Worth Wray also notes the possibility of European banks failing. Recently, George Soros has grabbed headlines making pessimistic bets on some of these risks, selling stocks and buying gold-related assets. A growing interest in the precious metal among high-profile investors is a sign of how widespread fear has become.

Although few economists in the Wall Street Journal survey mentioned Fed hawkishness as a recession risk, I worry about a major policy error by our regulators. A premature rate hike could send the American economy into a downturn, drive dollar strength, push commodities lower, and bring down global growth. If it does turn hawkish, the Fed risks repeating the mistake of 1937, when it threw the economy back into contraction by tightening too early.

If we do find ourselves in a recession, the Fed will have few tools left to deploy. Interest rates are already extremely low, leaving limited room for lowering the cost of money. It could experiment with negative rates, but that hasn't worked so well in Europe. Or it could try even more exotic maneuvers like helicopter money—in Ben Bernanke’s words, “an expansionary fiscal policy—an increase in public spending or a tax cut—financed by a permanent increase in the money stock.” Doing so would likely require coordination with Congress to allocate the funds. And no disrespect to our legislators, but they’re not exactly cooperative these days. Politics will constrain fiscal spending.


The shortfall of tools is just as worrying at the state level. According to the Brookings Institution, last year “only eight states had accumulated enough in their rainy-day funds to offset a hypothetical one-year loss of 10 percent or more of their annual expenditures.” The blunt fact is that states are as ill-equipped to handle a downturn as is the federal government.

So while we have a potential recession to fear, our inability to fight it may be even scarier. 

What US Dollar Moves Mean For You


Last week, a bleak jobs report shattered expectations of a June rate hike and sent the dollar down 1.6%, highlighting the fragility of the Fed’s plan to normalize monetary policy. Headlines were quick to relate the fortunes of the yen and Japanese stocks, copper and oil, and Australian miners to the currency’s moves.

Because of the dollar’s global role as a reserve currency, its ups and downs link seemingly unconnected dynamics. Its movements affect everyone from Japanese investors and British intelligence analysts to Argentinian small business owners and American families. What are the forces to watch affecting the greenback—and, more importantly, what do the moves mean for you?

The dollar's movements affect everyone from Japanese investors and British intelligence analysts to Argentinian small business owners and American families.

There are numerous sources of upward pressure on the dollar. For one, the American currency is believed to be a safe haven, so in times of heightened geopolitical or economic uncertainty, global investors park money in US assets such as US Treasury bonds. With concerns mounting in many regions around the world, from the Middle East and Brazil to Europe and China, global capital may turn to the US for safety, increasing demand for dollars and driving the currency to appreciate.

On top of that, the Fed continues to signal it wants to raise rates this year, which would also cause the dollar to appreciate. Why’s that? Because higher interest rates attract international investors, just as savvy depositors move money to savings accounts with the highest yield.


Meanwhile, economic weakness in Europe and Japan means the Bank of Japan and the European Central Bank will likely maintain very accommodative monetary policies for the foreseeable future. This would mean keeping rates lower for longer, continuing unconventional policies such as quantitative easing, and depressing the euro and the yen relative to the dollar. The flip side of a weak euro and weak yen is a stronger dollar. These devaluations are by no means incidental; they’re an economic strategy. Cheaper currencies make a country’s exports more attractive, boosting growth in local currency terms. If central banks outside the United States continue to lean on currency devaluations to drive their economies, the dollar will rise unless the Fed fights back.

What does a stronger dollar mean for you? On the plus side, the stuff we import, such as manufactured goods and commodities, gets cheaper. As do overseas vacations. Take a trip to Argentina, and your dollars will buy almost 20% more than they did last year. The impact of a strong dollar is deflationary for Americans, because prices in dollar terms tend to fall.
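The purchasing-power effect is just division; a back-of-the-envelope example with hypothetical prices and exchange rates (the peso figures below are invented for illustration) shows the mechanics.

```python
# Hypothetical example: when the dollar appreciates against a foreign currency,
# the same foreign-priced good costs fewer dollars.

def dollar_cost(foreign_price, fx_rate):
    """USD cost of an item priced in foreign currency, where fx_rate is
    units of foreign currency per dollar."""
    return foreign_price / fx_rate

hotel_night_pesos = 1500          # invented price in Argentine pesos

before = dollar_cost(hotel_night_pesos, 12.0)   # 12 pesos per dollar
after = dollar_cost(hotel_night_pesos, 14.4)    # dollar appreciates 20%

print(f"${before:.2f} -> ${after:.2f}")
```

A 20% appreciation makes the same room roughly 17% cheaper in dollar terms (125 dollars falls to about 104), which is the deflationary effect American travelers and importers feel.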

Cheap goods are great for consumers, but the news is not so bright for US exporters. When the greenback appreciates, the cost of American goods sold overseas rises in local currency terms, hurting demand for US exports. A strong dollar shifts inflation to other countries by making their imports more expensive. Currency moves subtracted 4.5% from Pepsi’s global revenue at the start of this year, masking a far worse 26% decline in Latin America, and Coca-Cola expects exchange rates to take 2% or 3% from revenues in 2016.


For US exporters, there are some countervailing signs of hope that the dollar could weaken. Declining perceptions of US economic conditions drive down the greenback, as global investors sell their stakes in US companies, bringing the dollar down with them. With a dim jobs report, presidential election uncertainty, low productivity growth, and declining corporate profits, the outlook is bleak; indeed, economists are now predicting 2016 to be the slowest year for US growth since 2012. A downturn could force the Fed to push interest rates negative, which would drive down the dollar further.

A challenge to the dollar’s status as the dominant reserve currency would also put downward pressure on the greenback. Today, the dollar accounts for “90 percent of all foreign exchange transactions,” according to Reuters. If a currency like the Chinese yuan meaningfully ate into this share, it would surely lead to dollar weakness. But such a development might also make international trading less volatile, according to New York Fed President William Dudley.

What would a weaker dollar mean for you? Overseas vacations, imported manufactured goods, and commodities would rise in price. If you’re seeking authentic sushi, for instance, you’ll find it more expensive to visit Japan or dine on flown-in fish, since the dollar has declined more than 10% against the yen in the past year.


While this might seem like a drag, it could ultimately be better for the world economy. Rising commodity prices would support the economies—and political systems—of countries that depend on them, promoting growth and stability. Developing countries would benefit as they could more easily handle dollar-denominated debt. And stronger emerging markets mean a stronger world economy. Global growth might accelerate.

The relative strength of the US dollar is not a black-and-white story for the US or the global economy as a whole. In fact, while a strong currency has its perks, a bigger-picture view suggests we may all be better off with a weaker dollar. Thanks to the world wide web of dollars, what happens on the other side of the globe has direct effects on your wallet. Whichever way it moves, the dollar’s fortunes are a reminder that in a complex, globally interconnected economy, it is crucial to look broadly to understand risks and spot opportunities.

Superbugs: The $100 Trillion Risk


Department of Defense researchers announced a shocking finding last month: they had identified an ominous, antibiotic-resistant strain of E. coli in a Pennsylvania woman seeking medical treatment at a military clinic. This is the first time this strain has been positively identified in a human in America. The superbug is resistant to colistin, an antibiotic used only when alternatives have failed.

The announcement has prompted a host of statements highlighting the global significance of this finding. In an article published last week, scientists warned this “heralds the emergence of a truly pan-drug resistant bacteria.” As the director of the Centers for Disease Control and Prevention (CDC) put it, “It is the end of the road for antibiotics unless we act urgently.”

While the current superbug is thankfully treatable with other kinds of antibiotics, according to the Washington Post, “researchers worry that its colistin-resistance gene, known as mcr-1, could spread to other bacteria that can already evade other antibiotics.” In that case, we could be out of viable defenses.

Superbugs are already an enormous public health problem. According to Vox, “In the United States alone, antibiotic-resistant infections are now associated with 23,000 deaths and 2 million illnesses every year.” By 2050, a report out of the UK suggested, drug-resistant infections will kill more people than cancer.

The current superbug, which researchers had previously seen in Asia and Europe, is a reminder of the unintended consequences of the modern quest for productivity. Aggressive use of antibiotics in the meat production industry to maximize yield has accelerated the development of superbugs. In the US, 80% of antibiotics are used on livestock.  North Carolina farm animals receive more of these drugs than all Americans combined.


Researchers believe that the colistin-resistant E. coli originated in Chinese livestock farms, where the antibiotic is used to help fatten pigs. It has also infected at least one American farm animal: the Department of Health and Human Services recently spotted the same strain in a pig intestine in the US.

In addition to our quest for productivity, the development of superbugs also reveals another broader phenomenon: our widespread zeal for safety may be making us less safe. Take the rise of allergies in rich countries, for example. Many researchers believe that in scrambling to keep our children clean—and thus safe from disease—we have prevented their immune systems from gaining early exposure to germs, causing them to malfunction later in life. Kids who grow up on farms have lower rates of allergies, while those more often exposed to antibacterial soap have higher rates.

Likewise, over-prescription of antibiotics to treat human conditions is also contributing to the development of superbugs. Roughly a third of all antibiotics prescribed in doctors’ offices and hospitals are unnecessary—often the result of demanding patients or parents.

Roughly a third of all antibiotics prescribed in doctors’ offices and hospitals are unnecessary. 

To counteract the increasing threat of superbugs, politicians and doctors agree, we must restrain abuse in both agriculture and medicine and also stimulate new drug development. Doing so will impact a number of sectors.

An obvious area of opportunity is in agriculture. Sales of antibiotic-free meats jumped 20% in 2014-2015. Heightened consumer awareness of the superbug threat will only accelerate this trend, affecting the entire food value chain from restaurants and grocery stores to farmers and ranchers.

In addition to consumer pressure, regulations will also drive a shift to antibiotic-free meats. Last October, California passed a strict law limiting the use of antibiotics in agriculture. Farms that can’t adapt to these shifting consumer preferences and restrictive regulations will suffer.


The superbug threat might also change the economics of drug development. Today, creating new antibiotics is not particularly lucrative for the pharmaceutical industry. As a result, there are only 37 antibiotics in clinical development, compared with over 800 cancer drugs or vaccines. Increased awareness of the need for new antibiotics could change this. In January, a coalition of major drug companies called for governments to offer incentives in this space.

The economic model for new antibiotics must not be based on volume, since that would incentivize drug companies to encourage the same over-prescription problem we face today. Instead, lump-sum prizes for successful drug development, valued in the billions of dollars, could spur investment in antibiotic research and development without the misaligned incentives. And indeed, governments have been considering just these measures.

The development of superbugs will certainly ripple through disparate industries, from food service to medicine and agriculture. Barring a solution, one estimate puts the cumulative hit to the global economy from antibiotic-resistant bacteria at as much as $100 trillion.

Some, including economist Jim O'Neill, suggest the problem is as significant as climate change. The stakes are high enough that the problem could drive national and international mobilization on an unprecedented scale.  If it doesn't, we could experience life as it was before antibiotics revolutionized the world in the 20th century—“nasty, brutish, and short(er).” Or, as British Prime Minister David Cameron recently said at a G7 meeting, “the end of modern medicine as we know it.”
