Before the
FEDERAL COMMUNICATIONS COMMISSION
Washington, D.C. 20554
In the Matter of

Service Rules for Advanced Wireless Services
In the 2155-2175 MHz Band
(WT Docket No. 07-195)

Service Rules for Advanced Wireless Services
In the 1915-1920 MHz, 1995-2000 MHz, 2020-2025 MHz and 2175-2180 MHz Bands
(WT Docket No. 04-356)
COMMENTS OF THE AMERICAN CIVIL LIBERTIES UNION
Caroline Fredrickson, Director
Michael Macleod-Ball, Chief Legislative and Policy Counsel
James Thomas Tucker, Policy Counsel
American Civil Liberties Union
Washington Legislative Office
915 15th Street, N.W.
Washington, D.C. 20005-1313
(202) 544-1681
July 25, 2008
SUMMARY
The ACLU supports making broadband services more accessible to the public through universal access and expansion of the broadband spectrum, including efforts to develop the vast unused portions of the wireless spectrum. The Commission should ensure that any licenses granted to utilize that spectrum, including the 2155-2175 MHz, 1915-1920 MHz, 1995-2000 MHz, 2020-2025 MHz and 2175-2180 MHz bands that are the subject of the proposed rules in these matters, guarantee users access to the lawful Internet content of their choice, using applications and services of their choice. At the same time, the Commission should decline to impose unconstitutional conditions on license applications, such as a requirement for so-called “family friendly” filters that would censor lawful content.
COMMENTS OF THE AMERICAN CIVIL LIBERTIES UNION
The American Civil Liberties Union (ACLU) and the Technology and Liberty Project of the ACLU have been principal participants in nearly all of the Internet censorship and neutrality cases that have been decided by the United States Supreme Court in the past two decades, including Reno v. ACLU,1 Ashcroft v. ACLU,2 Ashcroft v. Free Speech Coalition,3 and the Brand X decision, in which the Court held that cable companies providing broadband Internet access were “information service providers” for purposes of regulation by the FCC under the Communications Act.4 We also have provided comments on several of the recent petitions filed with the Commission that implicate neutrality principles.
The ACLU agrees with many of the concerns raised by Free Press, the Media Access Project, New America Foundation, and Public Knowledge about the troubling conditions that the Petition and the Commission’s Notice of Proposed Rulemaking would impose on a free and open Internet. However, we will limit our own comments to the Commission’s proposed requirement for “family friendly” filters as a condition of granting a license for the 2155-2175 MHz, 1915-1920 MHz, 1995-2000 MHz, 2020-2025 MHz and 2175-2180 MHz bands.
1 521 U.S. 844 (1997) (striking down the Communications Decency Act and holding that the government cannot engage in blanket censorship in cyberspace).
2 542 U.S. 656 (2004) (upholding a preliminary injunction of the Child Online Protection Act, which imposed unconstitutionally overbroad restrictions on adult access to protected speech).
3 535 U.S. 234 (2002) (striking down restrictions on so-called “virtual child pornography”). The ACLU’s amicus brief is available at 2001 WL 740913 (June 28, 2001).
4 See National Cable & Telecomm. Ass’n v. Brand X Internet Serv., 545 U.S. 967 (2005). The ACLU’s amicus brief is available at 2005 WL 470933 (Feb. 22, 2005).
Commissioner Tate previously cautioned the Commission to “balance the needs of families in protecting their children with constitutional and statutory requirements.” WT Dkt. No. 07-195, Notice of Proposed Rulemaking at 86. Concerns about whether the proposed rule is unconstitutional are well-founded. In Ashcroft v. ACLU and Reno v. ACLU, the Supreme Court enjoined and struck down, respectively, similar content-based restrictions in the Child Online Protection Act (“COPA”), Pub. L. No. 105-277, 112 Stat. 2681, and the Communications Decency Act of 1996 (“CDA”), Pub. L. No. 104-104, 110 Stat. 56. Like the proposed “family friendly” filter, COPA and the CDA were unconstitutional attempts by the government to impose mandatory regulations on the Internet to protect children. In this context, it does not matter that the censorship would be carried out by a private licensee if it is included by the Commission as a condition for issuing the license.
I. THE PROPOSED RULE IS A CONTENT-BASED RESTRICTION ON SPEECH SUBJECT TO STRICT SCRUTINY REVIEW.
The proposed rule would impose content-based restrictions on speech by requiring automatic filters that block access to pornographic, obscene, and indecent material, as well as “any images or text that otherwise would be harmful to teens and adolescents.” In Reno v. ACLU, the Court made it clear that the Internet is subject to the same constitutional standards that apply to content-based restrictions in other modes of communication.5 “Sexual expression which is indecent but not obscene is protected by the First Amendment.”6 Therefore, strict scrutiny applies to the proposed content-based
5 See 521 U.S. at 870 (“our cases provide no basis for qualifying the level of First Amendment scrutiny that should be applied” to the Internet).
6 Sable Commc’ns of Cal., Inc. v. FCC, 492 U.S. 115, 126 (1989).
regulation of speech, requiring the Commission to establish that it is the least restrictive means of furthering a compelling governmental interest.7
We can assume, without further comment, that the government has a compelling interest in protecting minors.8 But “even where speech is indecent and enters the home, the objective of shielding children does not suffice to support a blanket ban if the protection can be accomplished by a less restrictive alternative.”9 And even if the speech is not completely banned but merely burdened, the restrictions nevertheless are subject to strict scrutiny review.10 The Commission therefore bears the burden of demonstrating that the proposed regulation is the least restrictive means of accomplishing a compelling government interest.11 The Commission cannot meet this burden because the proposed rule’s content-based restriction deprives adults of access to protected speech.
II. THE PROPOSED RULE WOULD PROHIBIT ADULT ACCESS TO MATERIALS AND CONTENT PROTECTED BY THE FIRST AMENDMENT.
The Commission’s proposed rule would automatically block access to “pornographic, obscene, and indecent material and material that is unsuitable for minors.” However, the rule offers no guidance as to what would meet those definitions.
7 Id.
8 Of course, the Commission would have the burden of establishing that protecting minors is a compelling interest. The proposed rule treats all minors the same way. For example, an image from a health textbook depicting the male and female bodies might be suitable for a 16-year-old, but inappropriate for a five-year-old. By classifying all such images as unsuitable for minors, regardless of the child’s age or the circumstances of the material or image, the rule raises serious questions about whether the Commission will be able to meet its burden.
9 United States v. Playboy Entm’t Group, Inc., 529 U.S. 803, 814 (2000).
10 See id. at 812 (“The distinction between laws burdening and laws banning speech is but a matter of degree. The Government’s content-based burdens must satisfy the same rigorous scrutiny as its content-based bans.”).
11 See id. at 818 (“When First Amendment compliance is the point to be proved, the risk of non-persuasion – operative in all trials – must rest with the Government, not with the citizen.”).
Determination of what constitutes unprotected “obscene” material must comply with the standard set out by the Court in Miller v. California:
(a) whether the average person, applying contemporary community standards, would find that the work, taken as a whole, appeals to the prurient interest; (b) whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by state law; and (c) whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value.12
The absence of guidance on how the proposed rule could meet the Miller standard raises a host of questions. Who is going to make that determination? Will the Commission do so? Will it be the service provider? How will it be accomplished consistent with the neutrality principles the Commission embraced in 2005?13 Additionally, what contemporary community standards are to be applied? Is the relevant community the worldwide community of Internet users, the local community in which the user resides, or some other community? Furthermore, what would be subject to being blocked? Would it be a single visible screen on which the material appears (which may be significantly less than a single Web page)? Would it include links to the Web site as a whole? Does it also include linked Web sites? Moreover, what state law prohibiting “patently offensive” materials would apply? Would it be the state in which the service provider is located? The location of whoever is responsible for creating or maintaining the Web page or site? Or would it be where the user is located? The same questions arise with respect to the proposed rule’s regulation of pornographic and indecent materials and materials unsuitable for minors.
12 413 U.S. 15, 24 (1973).
13 The Commission established “Four Freedoms” in its 2005 policy statement, including user “access to the lawful Internet content of their choice” and running “applications and services of their choice.” See http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-260435A1.pdf.
The Supreme Court found that a similarly overbroad restriction in the CDA made it impossible to apply Miller in any meaningful way:
The general, undefined terms “indecent” and “patently offensive” cover large amounts of non-pornographic material with serious educational or other value. Moreover, the “community standards” criterion as applied to the Internet means that any communication available to a nationwide audience will be judged by the standards of the community most likely to be offended by the message.14
The Court described several examples of constitutionally protected speech that would be barred by such a blanket prohibition: “discussions about prison rape or safe sexual practices, artistic images that include nude subjects, and arguably the card catalog of the Carnegie Library.”15 Similarly, it could apply to a link that a parent e-mailed to his seventeen-year-old college freshman on birth control even though “neither he, his child, nor anyone in their home community found the material ‘indecent’ or ‘patently offensive,’ if the college town’s community thought otherwise.”16 By doing so, it would give a heckler’s veto to communities with the most restrictive standards, even if the vast majority of other communities, including the user’s own community, disagree.
Even if it were possible to resolve the problems with the proposal under Miller, the Commission has not explained how all material that is protected by the First Amendment would continue to be accessible. The reason is simple: some constitutionally protected material and speech would necessarily be blocked by the filtering requirement. In the process, the automatic mandatory filter would chill protected speech in violation of the First Amendment. The proposed rule’s application to
14 Reno v. ACLU, 521 U.S. at 877-78.
15 Id. at 878.
16 Id.
pornographic, obscene, and indecent material and material unsuitable for minors is unworkable, and would inevitably deny adult users access to constitutionally protected materials and speech. As such, it is facially unconstitutional.17
III. THE PROPOSED RULE’S EXCEPTION FOR ADULT CUSTOMERS TO IDENTIFY THEMSELVES AS ADULTS EXACERBATES ITS VIOLATIONS OF THE FIRST AMENDMENT.
Another unconstitutional aspect of the proposed rule would permit customers to disable the automatic filter after providing proof that they are adults. Without providing such proof, adult users would be denied their right to materials and information protected by the First Amendment. The Supreme Court has struck down similar restrictions that attempt to “reduc[e] the adult population… to … only what is fit for children.”18 Specifically, in the context of the Internet, regardless of the stated interest in protecting children, “‘the level of discourse… cannot be limited to that which would be suitable for a sandbox.’”19
The Supreme Court already has found that requiring users to prove that they have reached the age of majority does not cure unconstitutional restrictions imposed on adults. In Reno, the Court summarized the many problems of a similar requirement in the CDA, which provided for age verification by requiring a user to provide a credit card number:
[T]he imposition of such a requirement “would completely bar adults who do not have a credit card and lack the resources to obtain one from accessing any blocked material”…. “There is evidence suggesting that
17 See id. at 874; see also Secretary of State of Md. v. Joseph H. Munson Co., Inc., 467 U.S. 947, 967-68 (1984) (“Where, as here, a statute imposes a direct restriction on protected First Amendment activity, and where the defect in the statute is that the means chosen to accomplish the State’s objectives are too imprecise, so that in all its applications the statute creates an unnecessary risk of chilling free speech, the statute is properly subject to a facial attack.”).
18 Denver Area Educ. Telecomms. Consortium, Inc. v. FCC, 518 U.S. 727, 759 (1996).
19 Reno v. ACLU, 521 U.S. at 875 (quoting Bolger v. Youngs Drug Prods. Corp., 463 U.S. 60, 74-75 (1983)).
adult users, particularly casual Web browsers, would be discouraged from retrieving information that required use of a credit card or password”…. An adult password requirement would impose significant burdens on noncommercial sites, both because they would discourage users from accessing their sites and because the cost of creating and maintaining such screening systems would be “beyond their reach”….
“Even if credit card verification or adult password verification were implemented, the Government presented no testimony as to how such systems could ensure that the user of the password or credit card is in fact over 18.”20
The Court observed that “[t]hese limitations must inevitably curtail a significant amount of adult communication on the Internet.”21 As a result, Reno concluded that “there is no effective way to determine the identity or the age of a user who is accessing material” online.22 In Ashcroft, the Court upheld a preliminary injunction prohibiting enforcement of COPA,23 in part because the adult identification requirements it included – like those in the CDA – did not “constitute the sort of ‘narrow tailoring’ that will save an otherwise patently invalid unconstitutional provision.”24
Moreover, requiring credit card or age-verification screening before adults may disable the filter would severely burden the expression of both users and content providers. In the COPA challenge, the district court found that “consumers on the Web do not like the invasion of privacy from entering personal information” and that any requirement they do so “would have a negative effect on users because it will reduce
20 Reno v. ACLU, 521 U.S. at 856-57 (quoting the district court’s opinion).
21 Id. at 877.
22 Id. at 855.
23 See 542 U.S. at 656.
24 Reno v. ACLU, 521 U.S. at 882.
anonymity to obtain the speech… resulting in a loss of traffic to Web sites.”25 It also would force users to disclose personal information to a third party prior to being afforded access to constitutionally protected speech. In the process, it would place users in an untenable position: protect their privacy and forgo access to constitutionally protected speech and information, or exercise their First Amendment rights and forgo privacy.
Anonymous speech is protected under the First Amendment.26 Federal courts have struck down identity requirements for other communications media regulated by the Commission. For example, the Third Circuit struck down a law requiring adults to obtain access codes or other identification numbers in order to place a call to a telephone message service:
[T]he First Amendment protects against government inhibition as well as prohibition. An identification requirement exerts an inhibitory effect, and such deterrence raises First Amendment issues comparable to those raised by direct state-imposed burdens or restrictions…. [It is enough to invalidate a law where it is shown that] access codes will chill the exercise of some users’ right to hear protected communications.27
Another court struck down a similar requirement under New Mexico law “because it prevents people from communicating and accessing information anonymously.”28 As the Supreme Court explained in Denver, conditioning speech on identification requirements “will further restrict viewing by subscribers who fear for their reputations should the operator, advertently or inadvertently, disclose the list of those who wish to watch the
25 ACLU v. Reno, 31 F. Supp.2d 473, 487, 491 (E.D. Pa. 1999); see also id. at 487 (“in general, users of the Web are reluctant to provide personal information to Web sites unless they are at the end of an online shopping experience and prepared to make a purchase.”).
26 See McIntyre v. Ohio Elections Comm’n, 514 U.S. 334 (1995) (recognizing that anonymous political speech is protected under the First Amendment and striking down a requirement that speakers identify themselves).
27 Fabulous Assocs., Inc. v. Pennsylvania Pub. Util. Comm’n, 896 F.2d 780, 785-86 (3d Cir. 1990).
28 ACLU v. Johnson, 4 F. Supp.2d 1029, 1033 (D.N.M. 1998), aff’d, 194 F.3d 1149 (10th Cir. 1999).
‘patently offensive’ channel.”29 Any affirmative requirement for users to identify themselves and provide personal information as a condition for accessing material and information protected by the First Amendment is unconstitutional.
IV. VOLUNTARY FILTERS ARE A LESS RESTRICTIVE MEANS OF ALLOWING ADULTS TO RESTRICT THE ACCESS OF THEIR CHILDREN TO CONTENT ON THE INTERNET.
The proposed requirement for an automatic filter also is unconstitutional because there are less restrictive means available for parents to block their children’s access to protected indecent speech and materials on the Internet. Specifically, voluntary “[b]locking and filtering software is an alternative that is less restrictive… and in addition, likely more effective as a means of restricting children’s access to materials harmful to them.”30 As the Supreme Court explained in describing a voluntary filter in Ashcroft:
Under a filtering regime, adults without children may gain access to speech they have a right to see without having to identify themselves or provide their credit card information. Even adults with children may obtain access to the same speech on the same terms simply by turning off the filter on their home computers.31
The Court recognized that the government may encourage voluntary filtering by “enacting programs to promote the use of filtering software… [that] could give parents that ability without subjecting protected speech to severe penalties,” but the government may not make a filter mandatory.32 That is where the power to impose a filter on the content that children view rightfully belongs: with the parents, not the Commission, a licensee, or service provider.
29 518 U.S. at 754.
30 Ashcroft v. ACLU, 542 U.S. at 666-67.
31 Id. at 667.
32 Id. at 670 (emphasis added).
The proposed rule would be unprecedented in the Commission’s broadband licensing system. Never before has the Commission conditioned a license for the broadband spectrum on a requirement that the licensee impose a mandatory block or filter of an entire class of protected speech. Never before has the Commission limited adults’ ability to remove a mandatory filter by requiring that they first pay the licensee to access constitutionally protected material. As explained above, the reason is obvious: such a censorship model plainly violates the First Amendment and would be struck down if challenged. A paternalistic rule that authorizes the Commission to parent the parents through a mandatory filter that many adults – and perhaps the overwhelming majority of adults33 – will not be able to disable is unwarranted and facially unconstitutional. We urge the Commission to avoid setting a dangerous precedent of imposing unconstitutional conditions on licensees in the broadband spectrum.
CONCLUSION
The ACLU and the Technology and Liberty Project of the ACLU endorse efforts to exploit unused portions of the wireless broadband spectrum and to make the Internet more accessible for all Americans. However, even well-intentioned efforts to provide universal access or to make broadband services more affordable cannot come at the expense of the First Amendment. The Commission should reject all unconstitutional conditions on license applications for the 2155-2175 MHz, 1915-1920 MHz, 1995-2000 MHz, 2020-2025 MHz and 2175-2180 MHz bands that are the subject of the two
33 Based upon the evidence developed in Ashcroft, we believe it is likely that the overwhelming majority of adult subscribers to a “free” Internet service would find a requirement to pay to view blocked content or to provide personal information a barrier to exercising their First Amendment rights.
Petitions. In particular, the Commission should remove any requirements that licensees implement automatic “family friendly” filters that can be removed only by adult customers forced to forgo their privacy and anonymity. The Commission is entrusted with the public airwaves and wireless broadband spectrum, and as part of that trust must enforce neutrality rules that guarantee Internet access free of government or service provider censorship. We urge the Commission to act in a manner consistent with those principles.