
I’m Shocked, Shocked!

By James Kwak

Technology-land is abuzz these days about net neutrality: the idea, supported by President Obama, (until recently) the Federal Communications Commission, and most of the technology industry, that all traffic should be able to travel across the Internet and into people’s homes on equal terms. In other words, broadband providers like Comcast shouldn’t be able to block (or charge a toll to, or degrade the quality of), say, Netflix, even if Netflix competes with Comcast’s own video-on-demand services.*

Yesterday, the Wall Street Journal reported that the FCC is about to release proposed regulations that would allow broadband providers to charge additional fees to content providers (like Netflix) in exchange for access to a faster tier of service, so long as those fees are “commercially reasonable.” To continue our example, since Comcast is certainly going to give its own video services the highest speed possible, Netflix would have to pay up to ensure equivalent video quality.

Jon Brodkin of Ars Technica has a fairly detailed yet readable explanation of why this is bad for the Internet—meaning bad for the choices available to ordinary consumers and bad for the pace of innovation in new types of content and services. Basically it’s a license to the cable providers to exploit a new revenue source, with no commitment to use those revenues to actually upgrade service. (With an effective monopoly in many metropolitan areas and speeds already faster than satellite, the local cable provider has no market pressure to upgrade service, at least not until fiber becomes more widespread.) The need to pay access fees will make it harder for new entrants on the content and services side; in the long run, these fees could actually be good for Netflix, since it won’t have to worry as much about competition. The ultimate result will be to lock in the current set of incumbents that control the Internet, ushering in the era of big, fat, incompetent monopolies.

Continue reading

Software Is Great; Software Has Bugs

By James Kwak

I’m not qualified to comment on the internals of Bitcoin; I’m neither a programmer (OK, Alex, not much of a programmer) nor a computer scientist. But I do know that Bitcoin exists because of software that people wrote, and every means by which we use Bitcoin also operates because of software that people wrote. The problem here is the “people” part—people make mistakes under the best of circumstances, and especially when they have an economic incentive to rush out products. That’s why, while we love what software can do for us, we also like having a safety net—like, say, the human pilots who can take over a plane if its computers crash. This is the subject of my latest column over at The Atlantic. Enjoy.

Random Variation

By James Kwak

As I previously wrote on this blog, one of my professors at Yale, Ian Ayres, asked his class on empirical law and economics if we could think of any issue on which we had changed our minds because of an empirical study. For most people, it’s hard. We like to think that we form our views based on evidence, but in fact we view the evidence selectively to confirm our preexisting views.

I used to believe that no one could beat the market: in other words, that anyone who did beat the market was solely the beneficiary of random variation (a winner in Burton Malkiel’s coin-tossing tournament). I no longer believe this. I’ve seen too many studies that indicate that the distribution of risk-adjusted returns cannot be explained by dumb luck alone; most of the unexplained outcomes are at the negative end of the distribution, but there are also too many at the positive end. Besides, it makes sense: the idea that markets perfectly incorporate all available information sounds too much like magic to be true.
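To make the baseline concrete, here is a minimal simulation of Malkiel’s coin-tossing tournament, the “dumb luck” benchmark those studies compare against. The parameters are illustrative assumptions on my part (10,000 managers, ten years, a 50 percent chance of beating the market in any given year), not numbers from any particular study.

```python
import random

# Illustrative assumption: 10,000 managers, each with a coin-flip (50%)
# chance of beating the market in any given year, independent of skill.
random.seed(42)
num_managers = 10_000
num_years = 10

# Count managers who beat the market every single year by luck alone.
lucky_streaks = sum(
    all(random.random() < 0.5 for _ in range(num_years))
    for _ in range(num_managers)
)

expected = num_managers * 0.5 ** num_years
print(f"Ten-year streaks produced by pure chance: {lucky_streaks}")
print(f"Expected by chance alone: {expected:.1f}")  # about 10 out of 10,000
```

Chance alone produces a handful of apparent stars out of 10,000; the studies that changed my mind find more persistence at both tails, especially the negative one, than a baseline like this can account for.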

But that doesn’t mean that everyone who beats the market is actually good at what he does, even if that person gets a $100 million annual bonus. That person would be Andy Hall, the commodities trader who stirred up controversy when he apparently earned a $100 million bonus at Citigroup—in 2008, of all years. (That was a year with huge volatility in the commodities markets.)

Continue reading

No You Can’t

By James Kwak

Yesterday the Obama administration announced that healthcare.gov “will work smoothly for the vast majority of users.” Presumably they intended this as some sort of victory announcement after their self-imposed deadline of December 1 to fix the many problems uncovered when the site went live two months ago. But anyone who knows anything about software knows that it’s not enough to “work smoothly” for the “vast majority” of users.

Apparently pages are now loading incorrectly less than 1 percent of the time. Well, how much less? Pages failing even 1 percent of the time make for a terrible web experience, especially on a site where you have to click through a long sequence of pages. There is evident fear that the current site will not be able to handle a significant load, like the one it is likely to get around the deadline to sign up for policies beginning on January 1. And we know that “the back office systems, the accounting systems, [and] the payment systems”—in other words, the hard stuff—are still a work in progress.
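The arithmetic behind that claim is worth spelling out. The sketch below assumes a hypothetical enrollment flow of 10 to 30 pages (the site’s actual page count isn’t something I know) and uses a 1 percent chance of any given page failing, the administration’s own upper bound.

```python
# Rough arithmetic: a small per-page failure rate compounds over a
# multi-page enrollment flow. The flow lengths below are hypothetical.

def prob_any_failure(per_page_failure_rate: float, pages: int) -> float:
    """Chance that at least one page in the sequence fails to load."""
    return 1 - (1 - per_page_failure_rate) ** pages

for pages in (10, 20, 30):
    p = prob_any_failure(0.01, pages)
    print(f"{pages} pages at a 1% failure rate each: "
          f"{p:.0%} chance of hitting at least one error")
# Roughly 10%, 18%, and 26% of users hit an error somewhere in the flow.
```

In other words, even at the administration’s “less than 1 percent” figure, a meaningful fraction of applicants would still run into at least one broken page before finishing.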

None of this should come as any surprise—except to the politicians, bureaucrats, and campaign officials who run healthcare.gov. The single biggest mistake in the software business is thinking that if you throw resources at a problem and work really, really hard and put lots of pressure on people, you can complete a project by some arbitrary date (like December 1). It’s not like staying up all night to write a paper in college. This isn’t just a mistake made by people like the president of the United States. It’s made routinely by people in the software business, whether CEOs of software companies who made their way up through the sales ranks, or CIOs of big companies who made their way up as middle managers. You can’t double the number of people and cut the time in half. And just saying something is really, really important won’t make it go any faster or better.

Continue reading

Bad Government Software

By James Kwak

Ezra Klein, one of the biggest supporters of Obamacare the statute, has already called the launch of Obamacare a “disaster,” and it looks like things are now getting worse: as people are actually able to buy insurance, the data being passed to health insurers are riddled with errors (something Klein anticipated), in effect requiring applications to be verified over the phone. Bad software is one of my blogging sidelights, so I wanted to find out who built this particular example, and I found Farhad Manjoo’s WSJ column, which fingered CGI, a big old IT consulting firm (meaning that they do big, custom, software development projects, mainly for big companies). (See here for more on CGI.)

CGI was a distant competitor of my old company. I don’t recall facing them head-to-head in any deals (although my memory could be failing me), but they claimed to make insurance claim systems, which is the business we were in. So I don’t have an opinion on them specifically, but I do have an opinion on the general category of big IT consulting firms: they do crappy work, at least when they are building systems from scratch. (They generally do better when installing products developed by real software companies.)

Continue reading

Can You Say “Bubble”?

By James Kwak

Yesterday’s Wall Street Journal had an article titled “Foosball over Finance” about how people in finance have been switching to technology startups, for all the predictable reasons: The long hours in finance. “Technology is collaborative. In finance, it’s the opposite.” “The prospect of ‘building something new.'” Jeans. Foosball tables. Or, in the most un-self-conscious, over-engineered, revealing turn of phrase: “The opportunity of my generation did not seem to be in finance.”

We have seen this before. Remember Startup.com? That film documented the travails of a banker who left Goldman to start an online company that would revolutionize the delivery of local government services. It failed, but not before burning through tens of millions of dollars of funding. There was a time, right around 1999, when every second-year associate wanted to bail out of Wall Street and work for an Internet company.

The things that differentiate technology from banking are always the same: the hours (they’re not quite as bad), the work environment, “building something new,” the dress code, and so on. They haven’t changed in the last few years. The only things that change are the relative prospects of working in the two industries—or, more importantly, perceptions of those relative prospects.

Wall Street has always attracted a particular kind of person: ambitious but unfocused, interested in success more than any achievements in particular, convinced (not entirely without reason) that they can do anything, and motivated by money largely as a signifier of personal distinction. If those people want to work for technology startups, that means two things. First, they think they can amass more of the tokens of success in technology than in finance.

Second—since these are some of the most conservative, trend-following people that exist—it means they’re buying at the top.

The Importance of Excel

By James Kwak

I spent the past two days at a financial regulation conference in Washington (where I saw more BlackBerries than I have seen in years—can’t lawyers and lobbyists afford decent phones?). In his remarks on the final panel, Frank Partnoy mentioned something I missed when it came out a few weeks ago: the role of Microsoft Excel in the “London Whale” trading debacle.

The issue is described in the appendix to JPMorgan’s internal investigative task force’s report. To summarize: JPMorgan’s Chief Investment Office needed a new value-at-risk (VaR) model for the synthetic credit portfolio (the one that blew up) and assigned a quantitative whiz (“a London-based quantitative expert, mathematician and model developer” who previously worked at a company that built analytical models) to create it. The new model “operated through a series of Excel spreadsheets, which had to be completed manually, by a process of copying and pasting data from one spreadsheet to another.” The internal Model Review Group identified this problem as well as a few others, but approved the model, while saying that it should be automated and another significant flaw should be fixed.** After the London Whale trade blew up, the Model Review Group discovered that the model had not been automated and found several other errors. Most spectacularly,

“After subtracting the old rate from the new rate, the spreadsheet divided by their sum instead of their average, as the modeler had intended. This error likely had the effect of muting volatility by a factor of two and of lowering the VaR . . .”
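To see why dividing by the sum rather than the average cuts the result in half, here is a toy reconstruction. The spreadsheet’s actual formulas aren’t reproduced in the report excerpt, so the function names and rate values below are made up for illustration.

```python
# Toy illustration of the bug: the average of two rates is their sum
# divided by two, so dividing the rate change by the sum instead of the
# average produces a value exactly half as large. Rates are hypothetical.

def relative_change_intended(old_rate: float, new_rate: float) -> float:
    """What the modeler intended: rate change divided by the average rate."""
    return (new_rate - old_rate) / ((new_rate + old_rate) / 2)

def relative_change_as_built(old_rate: float, new_rate: float) -> float:
    """What the spreadsheet did: rate change divided by the sum of the rates."""
    return (new_rate - old_rate) / (new_rate + old_rate)

old, new = 0.0200, 0.0230  # hypothetical credit spreads
print(f"Intended: {relative_change_intended(old, new):.4f}")  # ~0.1395
print(f"As built: {relative_change_as_built(old, new):.4f}")  # ~0.0698, half as large
```

Feed a series of changes that are all half their true size into a volatility calculation and the volatility estimate, and hence the VaR, comes out correspondingly too low, which is consistent with the report’s conclusion that the error muted volatility by a factor of two.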

Continue reading