All Posts

This page shows all posts from all authors. Selected posts also appear on the front page.

My Morning Pick-Me-Up

First thing this morning, I'm sitting in my bathrobe, scanning my inbox, when I'm jolted awake by the headline on a TechDirt story:

California Senator Wants to Throw Ed Felten in Jail

I guess I'll take the time to read that story!

Kevin Murray, a California legislator, has introduced a bill that would fine, or imprison for up to one year, any person who "sells, offers for sale, advertises, distributes, disseminates, provides, or otherwise makes available" software that allows users to connect to networks that can share files, unless that person takes "reasonable care" to ensure that the software is not used illegally. TechDirt argues that my TinyP2P program would violate the proposed law.

Actually, the bill would appear to apply to a wide range of general-purpose software:

"[P]eer-to-peer file sharing software" means software that once installed and launched, enables the user to connect his or her computer to a network of other computers on which the users of these computers have made available recording or audiovisual works for electronic dissemination to other users who are connected to the network. When a transaction is complete, the user has an identical copy of the file on his or her computer and may also then disseminate the file to other users connected to the network.

That definition clearly includes the web, and the Internet itself, so that any software that enabled a user to connect to the Internet would be covered. And note that it's not just the author or seller of the software who is at risk, but also any advertiser or distributor. Would TechDirt be committing a crime by linking to my TinyP2P page? Would my ISP be committing a crime by hosting my site?

The bill provides a safe harbor if the person takes "reasonable care" to ensure that the software isn't used illegally. What does this mean? Standard law dictionaries define "reasonable care" as the level of care that a "reasonable person" would take under the circumstances, which isn't very helpful. (Larry Solum has a longer discussion, which is interesting but doesn't help much in this case.) I would argue that trying to build content blocking software into a general-purpose network app is a fruitless exercise which a reasonable person would not attempt. Presumably Mr. Murray's backers would argue otherwise. This kind of uncertain situation is ripe for intimidation and selective prosecution.

This bill is terrible public policy, especially for the state that leads the world in the creation of innovative network software.

Enforceability and Steroids

Regular readers know that I am often skeptical about whether technology regulations can really be enforced. Often, a regulation that would make sense if it were (magically) enforceable, turns out to be a bad idea when coupled with a realistic enforcement strategy. A good illustrative example of this issue arises in Major League Baseball's new anti-steroids program, as pointed out by David Pinto.

The program bars players from taking anabolic steroids, and imposes mandatory random testing, with serious public sanctions for players who test positive. A program like this helps the players, by eliminating the competitive pressure to take drugs that boost on-the-field performance but damage users' health. Players are better off in a world where nobody takes steroids than in one where everybody does. But this is only true if drug tests can accurately tell who is taking steroids.

A common blood test for steroids measures T/E, the ratio of testosterone (T) to epitestosterone (E). T promotes the growth and regeneration of muscle, which is why steroids provide a competitive advantage. The body naturally makes E, and later converts it into T. Steroids are converted directly into T. So, all else being equal, a steroid user will have a higher T/E ratio than a non-user. But of course all else isn't equal. Some people naturally have higher T/E ratios than others.

The testing protocol will set some threshold level of T/E, above which the player will be said to have tested positive for steroids. What should the threshold be? An average value of T/E is about 1.0. About 1% of men naturally have T/E of 6.0 or above, so setting the threshold at that level would falsely accuse about 1% of major leaguers. (Or maybe more – if T makes you a better baseball player, then top players are likely to have unusually high natural levels of T.) That's a pretty large number of false accusations, when you consider that these players will be punished, and publicly branded as steroid users. Even worse, nearly half of steroid users have T/E of less than 6.0, so setting the threshold there will give a violator a significant chance of evading detection. That may be enough incentive for a marginal player to risk taking steroids.
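To make the tradeoff concrete, here is a rough back-of-the-envelope sketch. The 1% natural false-positive rate at a 6.0 threshold comes from the discussion above; the 5% base rate of steroid use and the 55% detection rate are invented figures, used only to show how quickly false accusations pile up:

```python
# Illustrative sketch: what fraction of players who test positive
# actually use steroids? The base rate (5%) and true-positive rate
# (55%) are assumed numbers; only the 1% false-positive rate comes
# from the 6.0-threshold discussion above.

def positive_predictive_value(base_rate, false_pos, true_pos):
    """Probability that a player who tests positive actually uses steroids."""
    flagged_users = base_rate * true_pos          # users correctly flagged
    flagged_clean = (1 - base_rate) * false_pos   # non-users falsely flagged
    return flagged_users / (flagged_users + flagged_clean)

ppv = positive_predictive_value(base_rate=0.05, false_pos=0.01, true_pos=0.55)
print(f"P(user | positive test) = {ppv:.2f}")   # → P(user | positive test) = 0.74
```

Under these (made-up) assumptions, roughly one in four players branded a steroid user would be innocent, which is the point of the worry above.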

(Of course it's possible to redo the test before accusing a player. But retesting only helps if the first test mismeasured the player's true T/E level. If an innocent player's T/E is naturally higher than 6.0, retesting will only seem to confirm the accusation.)

We can raise or lower the threshold for accusation, thereby trading off false positives (non-users punished) against false negatives (steroid users unpunished). But it may not be possible to have an acceptable false positive rate and an acceptable false negative rate at the same time. Worse yet, "strength consultants" may help players test themselves and develop their own customized drug regimens, to gain the advantages of steroids while evading detection by the official tests.

Taking these issues into account, it's not at all clear that a steroid program helps the players. If many players can get away with using steroids, and some who don't use are punished anyway, the program may actually be a lose-lose proposition for the players.

Are there better tests? Will a combination of multiple tests be more accurate? What tests will Baseball use? I don't know. But I do know that these are the key questions to answer in evaluating Baseball's steroids program. It's not just a question of whether you oppose steroid use.

CBS Tries DRM to Block Criticism of Rathergate Report

Last week the panel investigating CBS's botched reporting about President Bush's military service released its report. The report was offered on the net in PDF format by CBS and its law firm. CBS was rightly commended for its openness in facing up to its past misbehavior and publicizing the report. Many bloggers, in commenting on the report and events that led to it, included quotes from the report.

Yesterday, Ernest Miller noticed that he could no longer copy and paste material from the report PDF into other documents. Seth Finkelstein confirmed that the version of the report on the CBS and law firm websites had been modified. The contents were the same but an Adobe DRM (Digital Restrictions Management) technology had been enabled, to prevent copying and pasting from the report. Apparently CBS (or its lawyers) wanted to make it harder for people to quote from the report.

This is yet another use of DRM that has nothing to do with copyright infringement. Nobody who wanted to copy the report as a whole would do so by copying and pasting – the report is enormous and the whole thing is available for free online anyway. The only plausible use of copy-and-paste is to quote from the report in order to comment, which is almost certainly fair use.

(CBS might reasonably have wanted to prevent modifications to the report file itself. They could have done this, within Adobe's DRM system, without taking away the ability to copy-and-paste material from the file. But they chose instead to ban both modification and copy-and-paste.)

This sort of thing should not be a public policy problem; but the DMCA makes it one. If the law were neutral about DRM, we could just let the technology take its course. Unfortunately, U.S. law favors the publishers of DRMed material over would-be users of that material. For example, circumventing the DRM on the CBS report, in order to engage in fair-use commentary, may well violate the DMCA. (The DMCA has no fair-use exception, and courts have ruled that a DMCA violation can occur even if there is no copyright infringement.)

Worse yet, the DMCA may ban the tools needed to defeat this DRM technology. Dmitry Sklyarov was famously jailed by the FBI for writing a software tool that defeated this very same DRM technology; and his employer, Elcomsoft, was tried on criminal charges for selling fewer than ten copies of that tool.

As it turns out, the DRM can apparently be defeated easily by using Adobe's own products. A commenter on Seth's site (David L.) notes that he was able to turn off the restrictions using Adobe Acrobat: "The properties showed it set to password security. I was goofin around and changed it to No Security adn it turned off the security settings. I then saved the pdf and reopened it and the security was gone.... Apparently forging documents is not all that CBS sucks at."

UPDATED (12:35 PM) to clarify: changed "cut-and-paste" to "copy-and-paste", and added the parenthesized paragraph.


French Researcher Faces Criminal Charges for Criticizing Antivirus Product

Guillaume Tena, a researcher also known as Guillermito, is now being tried on criminal copyright charges, and facing jail time, in France. He wrote an article analyzing an antivirus product called Viguard, and pointing out its flaws. The article is in French, and standard online translators seem to choke on it. My French is poor at best so I have only a general idea of what it says. But it sure looks like the kind of criticism a skeptical security researcher would write.

This is a standard legal-attack-on-security-researcher story. Company makes grand claims for its product; security researcher writes paper puncturing claims; company launches rhetorical and legal attack on researcher; researcher's ideas get even wider attention but researcher himself is in danger. Everybody in the security research field knows these stories, and they do deter useful research, while further undermining researchers' trust in unsupported vendor claims.

At least one thing is unusual about Tena's legal case. Rather than being charged with violating some newfangled DMCA-like law, he is apparently being charged with old-fashioned copyright infringement (or the French equivalent) because his criticism incorporated some material that is supposedly derivative of the copyrighted Viguard software. Unlike some previous attacks on researchers, this one may not have been enabled by the recent expansion of copyright law. Instead, it would seem to be enabled by a combination of two factors: (1) Traditional copyright law allows such a case to be brought, even though Tena had not caused the kind of harm that copyright law is supposed to prevent; and this allowed (2) a decision by the authorities to single him out for prosecution because somebody was angry about what he wrote.

It's bad enough that Tegam, the company that created Viguard, is going after Tena. Why is the French government participating? Here's a hint: Tegam's statement plays on French nationalism:

TEGAM International has for many years been the only French company to design, develop, market and provide support for antivirus and security software in France. It has chosen a global approach to security, not relying on signature updates [a method used by the most popular U.S. antivirus products].

In the software sector, everybody knows that some people would like to exert their technological domination, and as a result crush any attempt to create an alternative. As the battle goes on to try to preserve and strengthen research in France, TEGAM International defends its difference and the results of its own research.


Patent Holding Companies

Lately we've seen many complaints about the proliferation of patent holding companies, which buy patents, usually from small inventors, and then try to extract royalties, by negotiation or lawsuit, from companies that (allegedly) use the patented inventions. Often this is depicted as some kind of outrage. But from a policy standpoint I don't see a problem.

Now perhaps you believe that the patent system is irretrievably broken and ought to be scrapped or severely reformed. Perhaps you think it should be harder to bring patent lawsuits. If that's your position, then your policy effort should be spent on reforms that apply to all patent owners and all lawsuits, and not just on holding companies. Why focus specially on patent activity by holding companies, unless your goal is to disadvantage small inventors?

If, on the other hand, you buy into the goals of the patent system, and you think that the system, though imperfect, generally works, then it's hard to see the problem with holding companies. It seems sensible that the financial return for an invention ought to be the same, whether the inventor works for a big company or freelances in his garage. If the invention really is novel, non-obvious, and useful, then the inventor is entitled to reasonable royalties from people who use the patented technology. Why should small inventors face barriers that large inventors don't?

An inventor's ability to negotiate royalties depends, ultimately, on the threat that he will bring a lawsuit if the company using the invention doesn't agree to pay. Patent litigation is costly and time-consuming, especially if the defendant is using delay tactics. A freelance inventor can't credibly threaten to bring a suit without financial backing from somebody else. Litigation is risky, too, and the inventor may be risk-averse. The company using an invention knows these things, so a freelance inventor's lawsuit threat won't have much credibility, even if the suit would have merit. And so the freelance inventor won't be able to extract the royalties that a deeper-pocketed inventor could. It's often argued that the patent system unfairly favors large companies, for precisely this reason.

Why not allow an outside firm to invest in small inventors' patents, so as to provide the financial resources to support a potential suit and to absorb the risk? Coming from such a firm, a lawsuit threat would have suitable deterrent value. And, most importantly, such firms will bid against each other for small inventors' patents. Holding companies can level the playing field by helping small inventors extract the true value of their inventions.

Beyond this, holding companies may develop expertise in patent valuation or negotiating royalties. Holding companies that specialize in valuation and revenue-extraction allow small inventors to specialize in what they do best, which is inventing. This would mirror the structure in large companies, where one subgroup of people handles invention and another handles revenue-extraction. Why treat the small inventor differently from the large one?

Though there is no good policy argument for disadvantaging small inventors, we may see such changes anyway, due to rent-seeking by large companies. Those who support rational patent policy should focus on setting up the right patent rules (whatever they are), and applying those rules to whoever happens to own each patent.


Whom Should We Search at the Airport?

Here's an interesting security design problem. Suppose you're in charge of airport security. At security checkpoints, everybody gets a primary search. Some people get a more intensive secondary search as a result of the primary search, if they set off the metal detector or behave suspiciously during the primary search. In addition, you can choose some extra people who get a secondary search even if they look clean on the primary search. We'll say these people have been "selected."

Suppose further that you're given a list of people who pose a heightened risk to aviation. Some people may pose such a serious threat that we won't let them fly at all. I'm not talking about them, just about people who pose a risk that is higher than average, but still low overall. When I say these people are "high-risk" I don't mean that the risk is high in absolute terms.

Who should be selected for secondary search? The obvious answer is to select all of the high-risk people, and some small fraction of the ordinary people. This ensures that a high-risk person can't fly without a secondary search. And to the extent that our secondary-searching people and resources would otherwise be idle, we might as well search some ordinary people. (Searching ordinary people at random is also a useful safeguard against abusive behavior by the searchers, by ensuring that influential people are occasionally searched.)

But that might not be the best strategy. Consider the problem faced by a terrorist leader who wants to get a group of henchmen and some contraband onto a plane in order to launch an attack. If he can tell which of his henchmen are on the high-risk list, then he'll give the contraband to a henchman who isn't on the list. If we always select people on the list, then he can easily detect which henchmen are on the list by having the henchmen fly (without contraband) and seeing who gets selected for a secondary search. Any henchman who doesn't get selected is not on the high-risk list; and so that is the one who will carry the contraband through security next time, for the attack.

The problem here is that our adversary can probe the system, and use the results of those probes to predict our future behavior. We can mitigate this problem by being less predictable. If we decide that people on the high-risk list should be selected usually, but not always, then we can introduce some uncertainty into the adversary's calculation, by forcing him to worry that a henchman who wasn't selected the first time might still be on the high-risk list.

The more we reduce the probability of searching high-risk people, the more we increase the adversary's uncertainty, which helps us. But we don't want to reduce that probability too far – after all, if we trick the terrorist into giving the contraband to a high-risk henchman, we still want a high probability of selecting that henchman the second time. Depending on our assumptions, we can calculate the optimal probability of secondary search for high-risk people. That probability will often be less than 100%.
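This optimization can be sketched with a toy model. All the numbers here are invented for illustration (a 2% background selection rate, half the henchmen on the high-risk list); the point is only that, once the adversary can probe, the carrier's chance of being searched peaks at a selection probability well below 100%:

```python
def p_carrier_searched(p, b=0.02, r=0.5):
    """Probability the contraband carrier is searched on attack day,
    given that the adversary hands contraband to a henchman who was
    NOT selected on an earlier probe flight.
    p: selection probability for people on the high-risk list
    b: background selection rate for everyone else (assumed 2%)
    r: fraction of henchmen on the list (assumed 50%)"""
    not_sel_listed = r * (1 - p)              # listed, slipped through probe
    not_sel_unlisted = (1 - r) * (1 - b)      # unlisted, slipped through probe
    posterior_listed = not_sel_listed / (not_sel_listed + not_sel_unlisted)
    return posterior_listed * p + (1 - posterior_listed) * b

# Always searching listed people (p = 1) makes the probe fully
# informative: the carrier is then searched only at the background rate.
print(round(p_carrier_searched(1.0), 3))   # → 0.02

# The searcher does best with a selection probability around 60% here,
# which raises the carrier's search probability to roughly 19%.
best_p = max((i / 1000 for i in range(1001)), key=p_carrier_searched)
print(best_p < 1.0)   # → True
```

The exact optimum depends entirely on the assumed parameters, but the qualitative conclusion above (less than 100%) holds across a wide range of them.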

But now consider the politics of the situation. Imagine what would happen if (God forbid) a successful attack occurred, and if we learned afterward that one of the attackers had carried contraband through security, and that the authorities knew he posed a heightened risk but chose not to search him due to a deliberate strategy of not always searching known high-risk people. The recriminations would be awful. Even absent an attack, a strategy of not always searching is an easy target for investigative reporters or political opponents. Even if it's the best strategy, it's likely to be infeasible politically.


The "Pirate Pyramid"

This month's Wired runs a high-decibel piece by Jeff Howe, on topsites and their denizens:

When Frank ... posted the Half-Life 2 code to Anathema, he tapped an international network of people dedicated to propagating stolen files as widely and quickly as possible.

It's all a big game and, to hear Frank and others talk about "the scene," fantastic fun. Whoever transfers the most files to the most sites in the least amount of time wins. There are elaborate rules, with prizes in the offing and reputations at stake. Topsites like Anathema are at the apex. Once a file is posted to a topsite, it starts a rapid descent through wider and wider levels of an invisible network, multiplying exponentially along the way. At each step, more and more pirates pitch in to keep the avalanche tumbling downward. Finally, thousands, perhaps millions, of copies - all the progeny of that original file - spill into the public peer-to-peer networks: Kazaa, LimeWire, Morpheus. Without this duplication and distribution structure providing content, the P2P networks would run dry.

The story paints this as a sort of organized-crime scene, akin to a drug cartel, in which a great many people conspire, via some kind of command-and-control network, to achieve the widest distribution of the product. If true, this would be good news for law enforcers – if they chopped off the organization's head, "the P2P networks would run dry."

But this is the wrong way to interpret the facts, at least as I understand them. The topsites are exclusive clubs whose members compete for status by getting earlier, better content. The main goal is not to seed the common man's P2P net, but to build status and share files within a small group. Somebody on the fringe of the group can grab a file and redistribute it to a less exclusive club, as a way of building status within that lesser club. Then somebody on the fringe of that club can redistribute it again; and so on. And so the file diffuses outward from its source, into larger and less exclusive clubs, until eventually everybody can get it. The file is distributed not because of a coordinated conspiracy, but because of the local actions of individuals seeking status. The whole process is organized; but it's organized like a market, not like a firm.

[It goes without saying that all of this is illegal. Please don't mistake my description of this behavior for an endorsement of it. It's depressing that this kind of disclaimer is still necessary, but I have learned by experience that it is.]

What puts some people at the top of this pyramid, and others at the bottom? It's not so much that the people at the bottom are incapable of injecting content into the system; it's just that the people at the top get their hands on content earlier. Content trickles down to the P2P nets at the bottom of the pyramid, often arriving there before the content is available by other means to ordinary members of the public. Once a song or movie is widely available, there's no real reason for an ordinary user to rip their own copy and inject it.

The upshot is that enforcement against the top of the pyramid would have some effect, but much less than the Wired article implies. The main effect would be to delay the arrival of content in the big P2P networks, at least for a while, by blocking early leaks of content from the studios and production facilities. The files would still show up – there are just too many sources – but the copyright owners would gain a short interval of exclusivity before the content showed up on P2P. Certainly the P2P networks would not "run dry."

Don't get me wrong. Law enforcers should go after the people at the top of the pyramid. At least they would be making examples of the right people. But we should recognize that the rivers of P2P will continue to overflow.

UPDATE (7:25 PM): Jeff Howe, author of the Wired article, offers a response in the comments.


BSA To Ask For Expansion of ISP Liability

The Business Software Alliance (BSA), a software industry group, will ask Congress to expand the liability of ISPs for infringing traffic that goes across their networks, according to a Washington Post story by Jonathan Krim.

The campaign to modify the law is part of a broader effort by the BSA to address a variety of copyright and patent issues. In a report to be released today, the group outlines its concerns but offers no specifics on how the 1998 law should be changed. But in an interview, [Adobe chief Bruce] Chizen and BSA Executive Director Robert Holleyman said Internet service providers should no longer enjoy blanket immunity from liability for piracy by users.

The article doesn't make clear what limits BSA would put on ISP liability. Making ISPs liable for everything that goes over their networks would be a death blow to ISPs, because there is no way to look at a file and tell what might be hidden in it. (Don't believe me? Then tell me what is hidden in this file.) Actually, BSA members sell virtual private network software that hides messages from ISPs.
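To illustrate why simple inspection can't reveal hidden content, here is a toy least-significant-bit steganography scheme. Real steganography and VPN encryption are far more sophisticated; this sketch just shows how a payload can ride in bits an observer has no reason to examine:

```python
def hide(cover: bytes, secret: bytes) -> bytes:
    """Hide `secret` in the least-significant bits of `cover`.
    Needs 8 cover bytes per secret byte; each cover byte changes
    by at most 1, so the file still looks innocuous."""
    bits = [(byte >> i) & 1 for byte in secret for i in range(8)]
    assert len(cover) >= len(bits), "cover too small"
    out = bytearray(cover)
    for pos, bit in enumerate(bits):
        out[pos] = (out[pos] & 0xFE) | bit
    return bytes(out)

def reveal(stego: bytes, n: int) -> bytes:
    """Recover n hidden bytes from the low bits."""
    return bytes(
        sum(((stego[8 * j + i] & 1) << i) for i in range(8))
        for j in range(n)
    )

cover = bytes(range(256))        # stand-in for an innocuous file
stego = hide(cover, b"hi")
print(reveal(stego, 2))          # → b'hi'
```

An ISP scanning traffic sees only the cover file; without knowing the scheme, there is no reliable way to tell that anything is hidden at all.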

So the BSA must want something less than total liability. Perhaps they want to expand the DMCA subpoena-bot rule so that ISPs have to turn over a customer's name on demand. The music industry once claimed that the existing DMCA rule requires that, but the courts disagreed. Congress could amend the DMCA to override that court decision.

Or perhaps they want to hold ISPs liable unless they deploy filtering and blocking technologies to try to stop certain files from circulating and certain protocols from being used. These technologies are only stopgap measures that would soon be overcome by P2P designers, so requiring their deployment seems like bad policy.

Most likely, this is just a tactic to put political pressure on ISPs, in the hope of extracting some concessions. I predict that either (a) this will go nowhere, or (b) ISPs will agree to allow an expansion of the subpoena-bot rule.

Predictions for 2005

Here is my list of twelve predictions for 2005.

(1) DRM technology, especially on PCs, will be seen increasingly as a security and privacy risk to end users.

(2) Vonage and other leading VoIP vendors will start to act like incumbents, welcoming regulation of their industry sector.

(3) Internet Explorer will face increasing competitive pressure from Mozilla Firefox. Microsoft's response will be hamstrung by its desire to maintain the fiction that IE is an integral part of the operating system.

(4) As blogs continue to grow in prominence, we'll see consolidation in the blog world, with major bloggers either teaming up with each other or affiliating with major news outlets or web sites.

(5) A TV show or movie that is distributed only on the net will become a cult hit.

(6) The Supreme Court's Grokster decision won't provide us with a broad, clear rule for evaluating future innovations, so the ball will be back in Congress's court.

(7) Copyright issues will be stalemated in Congress.

(8) There will be no real progress on the spam, spyware, and desktop security problems.

(9) Congress will address the spyware problem by passing a harmless but ineffectual law, which critics will deride as the "CAN-SPY Act."

(10) DRM technology will still fail to prevent widespread infringement. In a related development, pigs will still fail to fly.

(11) New P2P systems will marry swarming distribution (as in BitTorrent) with distributed indexing (as in Kazaa et al). Copyright owners will resort to active technical measures to try to corrupt the systems' indices.

(12) X-ray vision technology will become more widely available (though not to the general public), spurring a privacy hoohah.


2004 Predictions Scorecard

A year ago, I offered seven predictions for 2004. Today, as penance for sins committed in 2004, it's my duty to exhume these predictions and compare them to reality.

(1) Some public figure will be severely embarrassed by an image taken by somebody else's picture-phone or an audio stream captured by somebody else's pocket audio recorder. This will trigger a public debate about the privacy implications of personal surveillance devices.

The Abu Ghraib images seem to fit the bill here: pictures taken by a phonecam that severely embarrass a public figure. When I made this prediction, I had in mind pictures or recordings of the public figure in question, but the prediction as written wasn't too far off.

Verdict: mostly right.

(2) The credibility of e-voting technologies will continue to leak away as more irregularities come to light. The Holt e-voting bill will get traction in Congress, posing a minor political dilemma for the president who will be caught between the bill's supporters on one side and campaign contributors with e-voting ties on the other.

E-voting technologies did lose credibility as predicted. The Holt bill did gain some traction but was never close to passing. Republicans did feel some squeeze on this issue, and it became a bit of a partisan issue. (Now that the 2004 election is past, there is more hope for e-voting reform.)

Verdict: mostly right.

(3) A new generation of P2P tools that resist the recording industry's technical countermeasures will grow in popularity. The recording industry will respond by devising new tactics to monitor and unmask P2P infringers.

P2P tools did evolve to resist technical countermeasures, for instance by using hashes to detect spoofed files. The recording industry is only now starting to change tactics. The big P2P technology of the year was BitTorrent, whose main innovation was in dispersing the bandwidth load required to distribute large files, rather than in evading countermeasures. Indeed, BitTorrent made possible a new set of countermeasures, which the copyright owners adopted near the end of the year.
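The hash-checking mentioned above can be sketched in a few lines. The piece size and file contents here are made up for illustration; real BitTorrent metadata stores a SHA-1 hash per fixed-size piece, so a spoofed or corrupted piece fails verification as soon as it arrives:

```python
import hashlib

def piece_hashes(data: bytes, piece_len: int = 4) -> list:
    """SHA-1 hash of each fixed-size piece, in the spirit of a
    .torrent file's 'pieces' field. (Real BitTorrent pieces are
    much larger; 4 bytes just keeps the example small.)"""
    return [hashlib.sha1(data[i:i + piece_len]).hexdigest()
            for i in range(0, len(data), piece_len)]

original = b"a genuine file body!"
expected = piece_hashes(original)        # published in the index

spoofed = b"a genuine fXle body!"        # one corrupted byte
received = piece_hashes(spoofed)

bad = [i for i, (e, r) in enumerate(zip(expected, received)) if e != r]
print(bad)   # → [2]  (only the piece containing the corruption fails)
```

A client that checks pieces against published hashes rejects spoofed files automatically, which is why the hash-based defense was effective against file spoofing.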

Verdict: mostly right.

(4) Before the ink is dry on the FCC's broadcast flag order, the studios will declare it insufficient and ask for a further mandate requiring watermark detectors in all analog-to-digital converters. The FCC will balk at the obvious technical and economic flaws in this proposal.

The studios did seem to want a watermark-based system to close the analog hole, but they were held back by its total infeasibility. My main error here was to misjudge the time scale.

Verdict: mostly wrong.

(5) DRM technology will still be ineffective and inflexible. A few people in the movie industry will wake up to the hopelessness of DRM, and will push the industry to try another approach. But they won't be able to overcome the industry's inertia – at least not in 2004.

DRM technology was nearly useless, as predicted. We're starting to hear faint rumblings within the movie industry that a different approach would be wise. But, as predicted, the industry isn't paying much attention to them.

Verdict: right.

(6) Increasingly, WiFi will be provided as a free amenity rather than a paid service. This will catch on first in hotels and cafes, but by the end of the year free WiFi will be available in at least one major U.S. airport.

Even some New Jersey diners now offer free WiFi. The Pittsburgh airport has offered free WiFi for nearly a year. And some airline clubrooms offer free WiFi that is accessible from nearby terminal areas.

Verdict: right.

(7) Voice over IP (VoIP) companies like Vonage will be the darlings of the business press, but the most talked-about VoIP-related media stories will be contrarian pieces raising doubt about the security and reliability implications of relying on the Internet for phone service.

VoIP got plenty of attention, but these companies were not "darlings of the business press". Security/reliability contrarian stories didn't get much play. This prediction went too far.

Verdict: mostly wrong.

Overall score: two right, three mostly right, two mostly wrong, none wrong. I'm a bit surprised to have done so well. Obviously this year's predictions need to be more outrageous. I'll offer them later in the week.

[UPDATE (1:15 PM): I originally wrote that the first prediction was wrong. Then an anonymous commenter pointed out that Abu Ghraib would qualify. See also the incident in India referenced in the comments.]
