Transnational Jurisdiction in Cyberspace

Data Protection Laws and Jurisdiction
Introduction

This portion of the American Bar Association project on “Transnational Jurisdiction in Cyberspace” concerns the application of data protection laws, designed to protect aspects of personal privacy, in the context of cross-border electronic commerce.

Electronic commerce offers unique opportunities to collect, store, and “mine” information about consumers, and to use that information to make the interaction between consumer and vendor more efficient. For example, the consumer’s subsequent visits to a website can be streamlined and tailored. This is made possible by using stored information from one or more typical sources:


  • information furnished originally by the consumer on a registration page,

  • data automatically recorded in a “cookie” file that is stored on the consumer’s computer (which might include identification, hardware and software capabilities, passwords and access codes),

  • information automatically collected online by reviewing other files on the consumer’s computer,

  • a transaction record of the consumer’s purchases, payments, license and warranty registrations, and e-mail communications,

  • “clickstream” data automatically recording the sequence of the consumer’s visits to pages and links on the site,

  • possibly consumer-specific or psycho-demographic modeling data obtained from third parties, such as individual reference services or marketing companies, to match with site-generated data.

Such information allows the vendor to reduce delays during site visits and eliminate the need for re-entering personal data on successive pages or in successive visits. Personal data may be necessary to authenticate the identity of a customer, process electronic payments, extend credit online, or to allow the consumer to track product delivery or verify account activity. Stored personal data may also be used to provide appropriate after-sale service, such as responding to technical questions, furnishing warranty service, and providing information about product upgrades or alerts. Analysis of individual transactions and preferences allows vendors to fine-tune their product offerings and advertising, and to target their direct marketing to the elusive “market of one.” Not surprisingly, consumer data has become a prime asset in electronic commerce.
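To make the mechanics concrete, the following is a purely hypothetical sketch (all field and function names invented for illustration) of how a vendor might merge the data sources listed above into a single consumer profile:

```python
# Hypothetical sketch: merging the typical data sources listed above
# (registration page, cookie file, clickstream, transaction record)
# into one consumer profile. All field names are invented.
def build_profile(registration, cookie, clickstream, transactions):
    return {
        # identification from the cookie, falling back to registration data
        "customer_id": cookie.get("customer_id") or registration.get("email"),
        "name": registration.get("name"),
        # hardware/software capabilities recorded in the cookie file
        "user_agent": cookie.get("user_agent"),
        # clickstream: the sequence of pages visited on the site
        "pages_viewed": [event["page"] for event in clickstream],
        # transaction record, kept in cents to avoid rounding problems
        "total_spent_cents": sum(t["amount_cents"] for t in transactions),
    }

profile = build_profile(
    registration={"name": "A. Consumer", "email": "a.consumer@example.com"},
    cookie={"customer_id": "c-1001", "user_agent": "Mozilla/4.0"},
    clickstream=[{"page": "/home"}, {"page": "/products/42"}],
    transactions=[{"amount_cents": 1999}, {"amount_cents": 500}],
)
```

An aggregated record of this kind is what makes tailored pages and streamlined repeat visits possible, and it is precisely the sort of consumer profile that the data protection laws discussed below regulate.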

At the same time, polls and informal surveys repeatedly show that consumers are often troubled by concerns that their stored personal data will be intercepted or misused, leading to embarrassment, harassment, fraud, credit card theft, erroneous judgments of their credit standing, or a flood of unwanted solicitations. Such concerns (which arise from both online and offline consumer profiling techniques) have motivated diverse legislation designed to protect aspects of informational or data privacy, as well as voluntary, contractual, and self-regulatory practices designed to reassure prospective customers.

These data protection measures, however, vary considerably from one jurisdiction to another, and often from one business sector to another. Consumers online frequently deal directly with vendors located in distant states or countries, where privacy laws and expectations differ from their own. In those cases, whose data protection laws and which enforcement mechanisms should govern the collection and use of personal information?

Our analysis starts in Part I with (A) a summary of the relatively new legal protections of informational or data privacy and rights to control direct commercial solicitations, as reflected in the laws of the United States, the European Union, and a sampling of other countries. We describe (B) how courts and regulators are likely to resolve conflicts of data protection law offline and (C) how traditional conflicts analysis is likely to be applied (and possibly modified) in the context of cross-border electronic commerce. In Part II we consider whether these results are likely to be predictable, fair, or efficient, and how they might produce adverse consequences for consumers and vendors in electronic commerce. Finally, in Part III, we offer some suggestions as to how conflicts analysis should be applied to data protection in cyberspace, and how viable alternatives to judicial or regulatory enforcement actions could evolve that would render the jurisdictional analysis irrelevant in most cases.


  I. Data Protection and Jurisdictional Conflicts

There are many laws and legal doctrines that in some way concern notions of personal privacy. For purposes of this analysis, we focus only on the collection and use of personal information relating to individual consumers for commercial purposes.1

Laws and announced voluntary practices in this area are relatively new, largely dating from the 1970s. In the United States, the evolution of consumer information or data privacy practices has largely been left to the marketplace, except for sectoral legislation governing specific data (such as telephone and cable television subscriber information) or specific uses of data (such as consumer credit reports), which are often regulated in considerable detail. In most of Europe, by contrast, and in a growing number of countries in other parts of the world, the trend has been to lay down broad, general principles for personal data protection in “omnibus” laws that cover most or all commercial activities. These laws are enforced chiefly by specialized regulatory bodies (generically termed the “Data Protection Authorities”), but they are increasingly enforceable in private court actions as well. As shown below, despite similar underlying principles, the relevant laws and enforcement mechanisms in the US and Europe differ in important details and are hard to reconcile. This raises conflicts issues that are multiplied and in some respects made more complex in the context of global electronic commerce.
A. US and European Approaches to Consumer Data Protection
In the United States (and this is also largely true of Canada, Australia, Japan, and Latin America), the “self-regulatory” approach to data protection in commercial transactions has been characterized, until very recently, by reliance on market forces to determine where to strike the balance between consumer data protection concerns and economically valuable uses of consumer data. Commercial data protection practices in these countries are largely driven by contractual promises and by concerns over business reputation, as well as compliance with consumer credit reporting laws and other specific legislation. Often, consumers are presented with a very real choice between preserving greater anonymity and obtaining something of value (an instant credit line, for example, or a discount on goods when using a store card that tracks purchases).
In Europe, by contrast, and in jurisdictions that have adopted European-style data protection laws (including Quebec, the Hong Kong SAR, Taiwan, New Zealand, and Israel), informational or data privacy is regulated according to broad principles stated in “omnibus” laws that are enforced more often by the state or national Data Protection Authorities than by courts. This makes for substantial differences in data protection rules and procedures, compared to the US and many other countries, and therefore in the expectations of consumers and vendors who meet each other in cyberspace.

It is useful to compare some of the legal privacy protections available in the United States and Europe, respectively, before examining how conflicts between them can be resolved.


US Constitutional protection against government, not private, intrusions. The legal protection of privacy in the United States (as in Canada, Australia, and most other countries outside Europe until quite recently) has traditionally focused on guarantees against abuses of government power, not commercial practices. The US Constitution, drafted in the 18th century, does not use the term “privacy,” but the First, Fourth, and Fifth Amendments protect individuals from unwarranted intrusions by the federal government into their personal lives. Prominently, this includes the Fourth Amendment right of individuals to be “secure in their persons, houses, papers, and effects” against unreasonable searches and seizures. Similar guarantees appear in the various state constitutions, and the Fourteenth Amendment to the federal Constitution has the effect of compelling federal and state courts to apply two centuries of Fourth Amendment jurisprudence to control the actions of state governments and of private parties acting “under color of” state law.

Apart from the latter, however, the federal Constitution (like all of the state constitutions except California’s, following a ballot initiative in the 1970s) offers no protection against invasion of privacy by private parties, or against unauthorized disclosures of personal information about them. The Supreme Court expressly held in United States v. Miller, 425 U.S. 435 (1976), that individuals have no protected Fourth Amendment privacy interest in personal information that they voluntarily provide to another private party (a bank, in that case). More recently, the federal Fourth Circuit Court of Appeals overturned Congress’ attempt to restrict the states in selling driver’s license information to commercial entities. See Condon v. Reno, 155 F.3d 453 (4th Cir. 1998). The court ruled that the federal Driver’s Privacy Protection Act violates the Tenth Amendment (reserving powers to the states) and cannot be justified under the Fourteenth Amendment, because there is no constitutional right of privacy with respect to “names, addresses and phone numbers.”


US legislative focus on government uses of personal data. Most federal and state legislation in the US aimed at information or data privacy similarly concerns the use of personal data collected by the government (e.g., for voting, tax, benefits, or law enforcement purposes). The federal Privacy Act of 1974, for example, seeks to control data matching from government databases. It reflects academic and information technology industry views of “fair information practices,” such as restricting disclosure to employees with a need to know and furnishing the individual data subjects with notice and access rights. But it concerns only the use of federal government records. Similar legislation exists at state level, as well as more detailed provisions applicable to particular kinds of government records, such as tax filings. In addition, statutory privacy exceptions to federal and state Freedom of Information Acts preserve the confidentiality of some personal information that governments would otherwise be obliged to make public.
Consumer privacy and tort law in the US. There is little caselaw on the protection of consumer privacy in the common law jurisprudence of the American states. In limited circumstances, the common law tort of “invasion of privacy” can be actionable in state courts (see ALI Restatement (Second) of Torts § 652), but this tort has not generally been interpreted to constrain the use of truthful information that was lawfully obtained. For example, in Dwyer v. American Express, 652 NE2d 1351 (Ill. App. 1995), the Illinois Appellate Court addressed claims of tortious invasion of privacy arising from the alleged sale, for marketing purposes, of cardmember names and addresses associated with certain buying patterns. The court dismissed the claims, noting that the customers voluntarily used their charge cards and that the company owned the resulting transactional data.
Constitutionally protected commercial “speech.” Moreover, in the United States the use of information by private parties is normally protected under the First Amendment guarantee of freedom of expression. As the US Supreme Court held, for example, in a 1989 newspaper case, Florida Star v. B.J.F., 491 US 524, 541 (1989), the public disclosure of lawfully obtained and truthful information must be allowed unless restraint is reasonably necessary to satisfy a state interest “of the highest order.”

Restrictions on less-protected “commercial speech” must also be justified by important public policy interests and limited to minimal constraints. Consumer or data privacy will not necessarily be recognized as such a compelling interest, as is demonstrated in the 1999 decision of the federal Tenth Circuit Court of Appeals invalidating the privacy regulations issued by the Federal Communications Commission to control the use of “customer proprietary network information” (“CPNI”). U.S. West, Inc. v. Federal Communications Commission, No. 98-9518 (10th Cir., Aug. 18, 1999). “CPNI” refers to information about a customer’s telecommunications equipment, telecom services, and patterns of network use obtained in the course of providing or billing for common carrier telecommunications services. The FCC’s rules were grounded on section 222(c) of the Communications Act of 1934, as amended by the Telecommunications Act of 1996, under the heading “Privacy of customer information.” The rules established, in most cases, a written opt-in requirement before CPNI could be disclosed to others (including the carrier’s affiliates) for marketing or for other purposes apart from what is required to provide and bill for the subscribed service.

The appellate court found that the FCC’s rules unconstitutionally infringed the carriers’ First Amendment right to engage in commercial speech with customers. The court questioned whether the government had adequately identified what privacy interests were at stake and why they required extraordinary protection, noting that privacy is a broad concept and entails social costs as well as benefits. The court ruled that the government failed to demonstrate “real” harm to privacy and that, in any event, the CPNI regulations were not narrowly tailored to the perceived privacy interests. Those might have been as well served, the court concluded, with less restrictive alternatives such as providing notice and an opportunity to opt out.2 Unless this decision is overturned by the United States Supreme Court, the FCC will be obliged to make a fuller record to establish what privacy interests its rules are designed to protect and to demonstrate that the agency has weighed the alternatives to find the least restrictive means of carrying out its objective.

Thus, even if there were a political consensus in the United States in favor of a European-style omnibus law regulating the collection and use of consumer information (and no such consensus has emerged at the federal level or in any of the states), such legislation could be challenged constitutionally as an overly broad constraint on commercial speech.


Specific US privacy or data protection-related laws. Federal and state legislatures have adopted a variety of more narrowly tailored laws designed, among other things, to prevent specific harmful or offensive commercial uses of personal data. These were written for the offline world but typically apply as well to online activities. Those with the greatest relevance for electronic commerce are listed below:


  • The federal Fair Credit Reporting Act regulates the collection and dissemination of “consumer reports” (statements that include information about a consumer’s creditworthiness, credit history, character, reputation, personal characteristics, or mode of living). A consumer reporting agency (any body that regularly assembles or evaluates information on consumers to furnish to third parties) may provide information to third parties only for a “permissible purpose.” These purposes include, for example, credit and loan transactions, check approvals, and decisions to issue insurance policies.

Marketing is not a permissible use of a consumer report. However, under the most recent amendments to the FCRA, a company can disclose its intention to share its own transactional experience with an affiliate for marketing or other purposes, and it may do so if the consumer does not object.

The FCRA also provides a number of privacy protections that are similar to those found in European data protection laws. Persons relying at least partly on a consumer report must notify the individual, who has rights under the FCRA to review the data supplied and to seek corrections or lodge objections in the file. Such corrections and objections must be communicated to the parties relying on the consumer report and transmitted in the future as part of the report. An individual may at any time obtain a copy of his or her consumer report from a consumer reporting agency. As there are three nationwide agencies in the United States, this is readily accomplished. The agencies must keep the data accurate and delete “obsolete” information (maximum retention periods are prescribed in the statute). Information relating to medical conditions or treatment may be disclosed to third parties only with the individual’s consent. Consumer reports may also be given to an employer or prospective employer only with the individual’s consent.

The FCRA is enforced by the Federal Trade Commission (or by the federal bank supervisory authorities, in the case of financial institutions) as well as through private judicial actions for damages (which are sometimes initiated as broad class actions representing many consumers). State attorneys general may also bring actions based on the FCRA, as Minnesota did this year with regard to a bank’s alleged sale of account information to a marketing company. Most states also have analogous credit reporting statutes enforced by private actions in the courts and by the state attorney general. Thus, any enterprise that regularly collects consumer information online and furnishes it to others must take care to satisfy any applicable obligations under the FCRA and pertinent state credit reporting laws.

  • Children’s privacy. The federal Children's Online Privacy Protection Act of 1998 ("COPPA") specifically addresses online information privacy practices with respect to children. Under COPPA, the Federal Trade Commission is promulgating rules to prevent the online collection and use of information from persons under 13 years of age, and its disclosure to third parties, unless there is sufficient notice and the consent of a parent or legal guardian. COPPA also requires web site operators to give parents access to the data furnished by the child and an opportunity to delete such data.


  • Bank privacy laws in several states regulate the disclosure of customer financial records to any third party (see, e.g., Ill. Rev. Stat. ch. 202, sec. 5/48.1; N.J. Stat. Ann. Sec. 17:16K-3; Minn. Stat. sec. 13A.01). These laws have direct application to Internet banking activities, although there are some unresolved questions about their jurisdictional reach in the case of financial services offered via the Internet from another state.3




  • The federal Electronic Funds Transfer Act, 15 USC §§ 1693 et seq. requires institutions that provide electronic banking and wire transfers to inform consumers of the circumstances in which information about their accounts or transactions will be disclosed to third parties in the course of effecting funds transfers.




  • Financial services reform. Congress has been debating substantial changes to the federal bank regulatory regime, which limits the activities of various kinds of financial institutions and often keeps them out of related financial services businesses. One result of financial services reform could be the freedom to create financial conglomerates offering retail banking, insurance, and stock brokerage services. The component parts of such conglomerates would want to share customer data with each other, not only for marketing purposes but, e.g., for better customer service, investment advice, consolidated account and transactions statements, credit evaluation, risk analysis, and fraud prevention. Many of these services are already being offered online, but typically by separate entities. Tracking some of the data protection undertakings included in the Federal Reserve Board’s 1998 approval of the Citibank/Travelers Insurance merger, the Clinton Administration and some legislators sought privacy provisions to the effect that customers must be informed of information sharing practices with affiliates as well as with third parties, and given an opportunity to opt out of such sharing. The various data protection proposals for financial services have been vigorously debated, and the precise contours of notice, opt-in or opt-out procedures, and appropriate exceptions are much disputed. If data protection standards are included in financial services reform legislation, they will presumably apply as well to the rapidly growing online financial services industry, enforceable in the courts as well as by the banking, securities, and possibly insurance supervisory authorities.




  • Federal bank supervisory authorities. The Office of Thrift Supervision issued a Policy Statement on Privacy and Accuracy of Personal Customer Information (November 1998). OTS, which regulates savings and loan associations, offered consumer privacy “guidelines” in the interest of protecting the “safety and soundness” of the institutions, which could be threatened by legal and reputational risks in the event of data protection breaches. The Office of the Comptroller of the Currency has also begun to examine financial institutions’ data protection policies under a similar rationale.




  • Telecommunications laws and regulations. The federal Telecommunications Act of 1996 includes provisions concerning the privacy of data about a customer’s use of telecommunications services, as well as provisions on subscriber lists. The statute requires the customer’s approval, for example, before a telecom carrier may provide information to an affiliated or third-party Internet Service Provider, but so far ISPs themselves have not been treated as telecommunications service providers subject to the Act’s customer data privacy provisions. In addition, the FCC and state telecommunications regulators have adopted rules allowing individuals to have unpublished telephone numbers and to suppress the transmission of caller identification signals to the party called (which must be done on a call-by-call basis). Some states have adopted laws or regulations forbidding telemarketers from suppressing their caller identification signals; the application of these laws to telemarketers calling from other states is a jurisdictional issue that has not yet been decided.




  • The Cable Communications Policy Act of 1984, 47 U.S.C. §§ 551 et seq., as amended by The Cable Television Consumer Protection and Competition Act of 1992, addresses the privacy of cable television viewer choices. The Act requires cable operators to disclose their practices in collecting and using personally identifiable information (such as which optional channels a subscriber takes, and which pay-per-view films a subscriber orders). It prohibits the sharing of such information without prior consent. The Act also provides consumers with the right to access cable company records for purposes of inspection and error correction. The statutory provisions are enforceable through private rights of action for damages.


US Privacy-related restrictions on direct commercial contacts. Privacy concerns are also behind efforts to place restrictions on the time and manner of direct commercial communications to consumers.

  • Mailing lists. At least one state requires opt-in consent by consumers before selling or sharing mailing lists or credit card account information,4 and others require notification and opt-out procedures in such cases.5 Such legislation presumably applies to postal address lists (and perhaps e-mail lists) developed from online screen registrations, as well as credit card information from online transactions. It is not uncommon for vendors and direct marketing organizations to exclude Maine residents from their direct mail campaigns because of the requirement for opt-in consent to use mailing lists; it is not so easy to restrict access to website advertising, however.




  • The Telephone Consumer Protection Act of 1991, 47 USC § 227 obliges telemarketers to make and consult lists of persons who do not wish to be called and forbids advertising by fax without consent. As with certain restrictions in the Fair Debt Collection Practices Act, communications are regulated that could foreseeably result in disrupting the consumer’s family life or employment relationship, or tying up the consumer’s telephone line or fax equipment. So far, there are no reported court decisions applying the Act to e-mail advertising.




  • The Telemarketing and Consumer Fraud and Abuse Prevention Act of 1994 (15 USC §§ 6101 et seq.) also creates a private cause of action in the courts and authorizes state enforcement actions in the case of deceptive or abusive telemarketing practices. Again, the courts have not applied this Act to e-mail advertising.




  • Anti-spam legislation. Congress or the states could similarly choose to restrict e-mail advertising, and many bills have been introduced at both state and federal level that have not been adopted. Several states (including Colorado, Delaware, Illinois, Nevada, North Carolina, Texas, Virginia, and Washington) have recently adopted “anti-spam” rules. These typically prohibit deceptively identified commercial e-mails, and some require either a prior relationship with the consumer or consent from the consumer (or, in some states, from the Internet service provider) before launching an e-mail advertising campaign.


FTC and State Enforcement of "Fair Trading Practices" Laws. The Federal Trade Commission has taken action against website operators for violations of their own announced privacy policies, on the theory that this constitutes an "unfair or deceptive" trade practice under the Federal Trade Commission Act, sec. 5(a). The most celebrated case, against GeoCities, concluded with a consent order in 1998 (see http://www.ftc.gov/os/1998/9808/geo-ord.htm). Most states have parallel legislation ("Little FTC Acts"), and consumer organizations in California, for example, have recently sued banks under California's law for sharing customer information with marketing companies.
European-style privacy regulation. The European approach, which is influential in many other countries, is to define privacy as a civil liberty and to establish rules of broad application for the handling of personal data. Thus, both law and practice differ between the US and Europe, and there are ongoing discussions between the US Department of Commerce and the European Commission in an effort simply to find ways for customer and employee data to continue to move across the Atlantic in the context of international business. A brief discussion of European data protection law will highlight the potential for conflict and the need to establish either clear and workable agreements on jurisdiction or effective alternatives to administrative and court enforcement of European privacy laws when the data are processed abroad in the course of international business.

Privacy as a human right. In the aftermath of the Second World War and the defeat of totalitarian regimes in Europe and East Asia, sweeping declarations of human rights were adopted by the United Nations and the Council of Europe. Interestingly, those rights were not limited by their terms to protections against government intrusion. The 1948 United Nations Universal Declaration of Human Rights proclaims,

“No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks on his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.” (Art. 12)


The 1950 Council of Europe Convention on Human Rights similarly declares, “Everyone has the right to respect for his private and family life, his home and his correspondence” (Art. 8(1)).



In Western Europe, these declarations were taken to mean that governments had an obligation to protect their citizens’ privacy from intrusions by others as well as by the government itself.
The concept and scope of data protection. As noted above, the United States developed notions of “fair information practices” (including accuracy, notice, choice, access, and security) with respect to government databases, notably in the Privacy Act of 1974 and the privacy exceptions of the Freedom of Information Act of 1974, and also with respect to specific commercial activities, prominently credit reports. In the same period, several jurisdictions in Western Europe (starting with the German state of Hesse and the national legislatures in Sweden and France in the 1970s) adopted similar principles in data protection laws that applied to the private sector as well as to some government activities. The French law extended to any organized data files, manual or automated, but the trend in Europe was to deal more narrowly with the “new” problem of regulating centralized computer databases and data matching techniques, which were perceived to expose personal data to systematic searches, greater sharing of data, and new uses for the data beyond what had been disclosed to the individual.
The OECD Privacy Guidelines. By the late 1970s, there were public-sector privacy laws in the US and omnibus privacy laws in several European countries, drawing on roughly the same fair information practices principles, but there was already concern over the divergence of national laws implementing those principles. After much study, the member nations of the Organisation for Economic Co-operation and Development (OECD), including the United States, Japan, and Canada as well as the leading European national economies, adopted the 1980 OECD Recommendation Concerning Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data (the “OECD Privacy Guidelines”). According to the Explanatory Memorandum, the member countries feared that “disparities in legislation may create obstacles to the free flow of information between countries.”
Substantively, the OECD Privacy Guidelines recognize certain “basic principles” (Arts. 7-14) as “minimum standards” (Art. 6) for handling personal data, “whether in the public or private sectors” (Art. 2):


  • Collection limitation (collection by lawful and fair means and, “where appropriate,” with the knowledge or consent of the individual data subject);




  • Data quality (relevance and accuracy for the intended purpose);




  • Purpose specification (use should be limited to the purposes specified at the time of collection, and new or changed purposes should be specified subsequently);




  • Use limitation (use or disclosure only with consent, or under authority of law, or in a manner not incompatible with the specified purpose);




  • Security safeguards (reasonable precautions against loss, alteration, or unauthorized access);



  • Openness (there should be a general policy of openness about developments, practices, and policies with respect to personal data);

  • Individual participation (data subject’s right to confirm the existence of files, to have the data communicated to him, to be given reasons for any denial of access and an opportunity to challenge that denial, and to “challenge” data relating to him and, “if the challenge is successful” (no standard is given for this determination), to have the data corrected or erased);




  • Accountability principle (data controllers should be accountable for complying with “measures” that give effect to these principles).
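The principles above are legal norms, not code, but a short hypothetical sketch (all class and method names invented) may clarify how several of them, notably purpose specification, use limitation, individual participation, and data quality, would constrain a data controller's handling of a record:

```python
# Illustrative sketch only: how some OECD principles might translate
# into data-handling logic. All names are invented for illustration.
class PersonalDataRecord:
    def __init__(self, subject, data, purpose):
        self.subject = subject           # the individual data subject
        self.data = dict(data)           # collected by lawful and fair means
        self.purpose = purpose           # purpose specified at collection
        self.consented_uses = {purpose}  # use limitation: consented purposes

    def use(self, purpose):
        # Use limitation: use or disclose only for consented purposes.
        if purpose not in self.consented_uses:
            raise PermissionError(f"use for '{purpose}' not consented")
        return self.data

    def access(self, requester):
        # Individual participation: the subject may see his or her data.
        if requester != self.subject:
            raise PermissionError("access limited to the data subject")
        return dict(self.data)

    def correct(self, requester, field, value):
        # Data quality: a successful challenge leads to correction.
        if requester != self.subject:
            raise PermissionError("only the data subject may challenge data")
        self.data[field] = value

record = PersonalDataRecord("alice", {"address": "1 Main St"}, "billing")
record.use("billing")                 # permitted: matches specified purpose
record.correct("alice", "address", "2 Oak Ave")
try:
    record.use("marketing")           # not consented: use limitation applies
except PermissionError:
    blocked = True
```

The sketch also shows why the principles leave hard questions open: nothing in them says, for example, what standard governs whether a challenge is "successful," which is exactly the kind of detail national laws fill in differently.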

The OECD Privacy Guidelines include a section (Arts. 15-18) on the obligation of member nations to take steps to ensure the free flow of personal information among them. Exceptions may be made where the importing country “does not yet substantially observe these Guidelines” or match specific data protections in the exporting country, or where the “re-export” of the data threatens to “circumvent” domestic data protection legislation. Article 19 says that member countries should give effect to the Guidelines by adopting appropriate domestic legislation, encouraging self-regulation, providing adequate procedures and remedies for individuals, and ensuring there is no unfair discrimination against data subjects.


The OECD Privacy Guidelines, although they do not have the force of an international treaty, have been influential in national legislation and self-regulatory initiatives. They are also the foundation for the US Administration’s and the FTC’s data protection objectives, as well as for the current US-EU “Safe Harbor” negotiations.
Council of Europe Convention 108. In 1981 the Member States of the Council of Europe (which now comprises nearly all European nations, inside and outside the separate treaty organization of the European Union) adopted a Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data (Council of Europe Convention No. 108). The Convention entered into effect in late 1985. This gave treaty force to essentially the same principles as were found in the 1980 OECD Privacy Guidelines, and also established an ongoing body to make further studies and recommendations concerning specific data processing applications.
The data protection principles were reorganized and slightly expanded in Convention 108, including, for example, a provision that data should be kept in personally identifiable form for no longer than required for the specified purpose of the data collection. The Convention also recognized “special categories” of data revealing race, political opinions, religious beliefs, health or sexual life, or criminal convictions; processing such data was to be subject to “appropriate safeguards” in domestic law. Member countries were also to designate an authority to liaise with data protection authorities in other countries and handle requests for assistance by individuals.
Council of Europe Convention 108 became the basis for data protection laws enacted in the late 1980s and 1990s in most European countries. Although the Convention contains provisions similar to those found in the OECD Privacy Guidelines on allowing the free flow of data across borders, some countries imposed authorization requirements and special restrictions even for data flowing to another nation that had enacted laws based on the Convention. France, for example, routinely imposed restrictions on data flows destined for England, because of differences in the scope and detail of their respective laws, as well as imposing conditions on data transfers to countries such as Italy and the United States that did not have similar laws.
The EU Directive. Because of continuing differences among the EU Member States, the EU institutions determined to effect a greater harmonization of data protection laws, including measures to prevent the laws from being bypassed by onward transfer of data to countries lacking “adequate” levels of data protection. The result was Directive 95/46/EC, on the protection of individuals with regard to the processing of personal data and on the free movement of such data (the “EU Data Protection Directive”). The Directive was adopted in 1995 and was to be implemented by the Member States by October 1998; some are still in the legislative process of transposing it.

The Directive is largely built on the model of Council of Europe Convention 108, including its principles and terminology. But the Directive has much more substantive and procedural detail, as well as mechanisms for establishing European-level data protection policies and decisions, particularly with regard to transborder data flows to non-EU countries.


Broad scope of the EU Directive. The scope of the 1995 Directive is greater than virtually all of the national laws it supersedes. It covers the “processing” (defined to include anything that can be done with data, from collection to deletion) of “personal data” (any information relating to an identified or identifiable natural person), either by automated means or as part of an organized filing system (although a ten-year transition period is allowed for manual files, compared to three years for automated data processing). Processing for purely personal or household purposes is excluded, as are government activities (such as the police and the military) that are outside the scope of the EU Treaty itself. Essentially all processing of personal data for commercial purposes is, therefore, to be covered by the national laws implementing the Directive. It is difficult to imagine many activities of an enterprise relating to its customers or employees that do not entail some identifiable reference to individuals.
Grounds for processing under the Directive. Under Article 6 of the Directive, data controllers (those persons or entities that determine the purposes and means of processing, alone or jointly with others) are responsible for implementing the “data quality” principles drawn from Council of Europe Convention 108. According to Article 7, processing is only permitted if the individual (“data subject”) has given “unambiguous” consent or if one of the other enumerated conditions is satisfied. These include processing “necessary for the performance of a contract” with the data subject, or in order to enter into a contract with the data subject, and processing necessary for compliance with a legal obligation. Article 7(f) provides that processing may also be allowed under a balancing test – if the individual’s privacy interests do not override the “legitimate interests” of the controller. This is a potentially large, but untested, ground for lawful processing of personal data.
Sensitive data; balancing freedom of expression. Processing of “special categories” of sensitive data (race, religion, political opinion, trade union membership, health or sex life) normally requires consent, but there are a few exceptions. (Art. 8.) Processing of data relating to criminal offences, security measures, or civil or administrative proceedings may be specially controlled under national law. Member States are to provide appropriate exemptions for journalism and literary expression, but only to the degree necessary to reconcile the right of privacy with the freedom of expression. (Art. 9.)
Expanded individual rights. The Directive is more detailed than the Council of Europe Convention in listing what information must be disclosed to the individual (Arts. 10 and 11). It repeats much of the language of the Convention on the individual’s rights of access to “the data” about her (Art. 12). The Directive includes wholly new provisions giving individuals a right to object to direct marketing (Art. 14(b)) and restricting the ability of controllers to employ screening or scoring software to reach “automated individual decisions” (Art. 15).
Notification under the Directive. Data controllers are responsible for notifying the authorities of their data processing activities (Art. 18). Alternatively, Member States may provide (as do Germany and Sweden) for a company to appoint its own internal data protection officer, who is then responsible for keeping a registry of the organization’s personal data processing practices and making it publicly available (Art. 18(2)).
European data protection authorities and remedies. The Directive stipulates that Member States must give independent data protection authorities investigative powers and the authority to order the blocking of data processing or data transfers (Art. 28). The Member States must also provide for judicial remedies, including injunctive relief, compensatory damages, and “suitable” sanctions to ensure compliance (Arts. 22-24).
European-level procedures. In the past, the interpretation and enforcement of data protection laws have been effected only at national level (or in the case of federal Germany, at state and national level). The Directive establishes EU-level procedures, including procedures for adopting additional measures that are binding on all Member States as a treaty obligation. A Working Party on the Protection of Individuals with regard to the Processing of Personal Data is established under Article 29 (the “Article 29 Working Party”), composed of representatives of the independent data protection authorities in each of the fifteen Member States and a representative of the European Commission. In addition, Article 31 contemplates a Committee (the Article 31 Committee) directly representing the Member State governments, which can be consulted as needed.

The Article 29 Working Party has actually been meeting quarterly since 1996 and producing papers and recommendations in several ongoing projects. This has been an important forum for communication and debate among the national data protection authorities, and some of the papers produced to date provide greater clarity as to how the rather general terms of the Directive will likely be enforced with respect to particular data applications. (The Working Party’s papers published to date are posted online at http://europa.eu.int/comm/dg15/en/media/dataprot/wpdocs/index.htm.) The Working Party’s opinions are advisory rather than mandatory, although they have a special role in the evaluation of third-country data protection, as explained below.


The Directive contemplates that there will be further detailed harmonization of Member State laws and practices over time. To effect such harmonization, the European Commission, on its own or at the recommendation of the Article 29 Working Party, can submit draft mandatory measures to the Article 31 Committee. The Committee (by a qualified majority vote, weighted to reflect the relative populations of the various Member States) may approve the Commission’s draft measures, in which case they are binding on all the Member States. If the Committee disapproves of the Commission’s draft measures, the European Council (representing the Member State governments at the highest level) may reject them, again by qualified majority vote, within three months. Otherwise, the measures go into effect and are binding on the Member States.
Transborder data transfers from the EU. Although data should ultimately move freely within the EU as a consequence of the Directive, Articles 25 and 26 address the question of data transfers to “third countries” outside the EU. Under Article 25, Member States are obliged to provide that data are exported only to third countries that ensure “an adequate level of protection.” Adequacy is to be assessed in the light of all the circumstances, which may take account of “professional rules and security measures” followed in the destination country, as well as its laws. Where one or more Member States question the adequacy of protection in a third country, the Article 29 Working Party is to give the Commission an opinion on that issue (Art. 30(b)). The Commission may, following the Article 31 process outlined above, order the Member States to prevent data flows to that country. It may also enter into negotiations with a third country, as it has done with the United States, to find ways to ensure adequate protection there for personal data from the EU. In that event, the Commission can also adopt a decision that is binding on the Member States through the Article 31 process (Art. 25(5) and (6)). That is the process that may be employed to give effect to any “safe harbor” data protection principles and procedures that are ultimately agreed between the European Commission and the US Department of Commerce, and subsequently with any other third countries.

Failing a determination of adequacy or agreed data protection provisions in a third country, some personal data may still be transferred under one or more of the “derogations” outlined in Article 26 of the Directive. For commercial purposes, the most important derogations are for transfers with the “unambiguous” consent of the data subject (Art. 26(1)(a)), transfers necessary to enter into or perform a contract with the data subject (Art. 26(1)(b)), and transfers subject to sufficient contractual or other guarantees of protection (Art. 26(2)). A Member State must notify the Commission and the other Member States to the extent that it approves individual data flows subject to contractual or other guarantees, and the Commission can adopt binding measures through the Article 31 process that might compel the Member State to block or change its conditions for such transfers (see Art. 26(3)). The Commission may also approve “standard contractual clauses” under Art. 26(4), although it has not yet done so.

So far, the Article 29 Working Party and the Commission have not given unqualified approval to data transfers to many third countries, although nations with very similar laws and enforcement practices are likely to get favorable treatment. Several jurisdictions outside Europe, including Quebec, the Hong Kong Special Administrative Region, New Zealand, and Taiwan, have European-style data privacy laws and enforcement bodies and hope to be deemed “adequate” for continuing data flows from EU countries. Other countries, including Japan, Canada, and Australia, are debating national laws designed to satisfy the adequacy test as well as domestic political agendas. Still others, such as Argentina and Brazil, have not yet adopted constitutionally mandated privacy laws and are now considering such legislation in light of the EU Directive and the future enforcement policy of EU Member States.
Data protection in telecommunications. Also effective October 1998, EU Directive 97/66/EC applies the principles of the general Data Protection Directive to the telecommunications sector. As with some of the American legislation discussed above, this Directive lays down rules for subscriber directories, caller identification, telemarketing, and the disclosure of usage or billing data by telecommunications service providers. This has implications particularly for those telecom companies that also act as Internet Service Providers, as they must, for example, allow their subscribers to opt out of cross-marketing uses of the data. The Member States are just beginning to implement this Directive, however, and it was not drafted with the Internet in mind.

Significantly, in August 1999 the EU proposed to the World Trade Organization that Internet access and Internet network services should be classified as telecommunications services under the WTO’s Agreement on Basic Telecommunications Services. If the EU follows this classification internally, it might well seek to extend the requirements of the Telecommunications Privacy Directive to ISPs operating in Europe, and not only to the providers of public telecommunications services. This could affect ISP practices with regard to subscriber directories, anonymity, and marketing, as well as obliging online marketers to create or consult no-contact lists (which the European Commission has separately proposed). The application of such rules to ISPs, Internet backbone networks, or online marketers located outside the EU would certainly raise controversy. And as noted below, this jurisdictional issue is not being addressed in connection with the US-EU privacy “Safe Harbor” negotiations.


Contractual safeguards. Several Member States, even under pre-Directive data privacy laws, have long permitted data flows from individual companies (or among business networks such as travel reservation systems and credit card associations) based on contractual guarantees. France pioneered this approach in the 1980s among the national affiliates of Fiat, with respect to their employee data. A joint project of the Council of Europe, the European Commission, and the International Chamber of Commerce produced model contract clauses in the early 1990s that have been used routinely in Switzerland and that have served as a model elsewhere. In 1995, the German state data protection authorities agreed to allow Citibank to process German credit card application and transactions data in the United States, subject to a detailed agreement between the relevant Citicorp affiliates in both countries. This “interterritorial” agreement essentially imposed German standards (and potentially audits by the German banking or data protection authorities) on the US processing operations.

Typically, contractual guarantees entail such an agreement (which in some countries must be approved in advance by the data protection authority) between the data “exporting” party in Europe and the data “importing” party in the United States or some other third country. If the importing party fails to comply with its obligations under the agreement, the data protection authority can still take action against the exporting party in Europe and order it to suspend transfers. Under the law of some countries in Europe, the affected individuals would also have third-party beneficiary rights as a result of the contract and could sue either the data exporter or the data importer. European data protection authorities resist American criticism of their efforts to “export” European privacy laws. They view contractual solutions as a means of “repatriating” jurisdiction (through contract law rather than public law) where they would otherwise have to prevent the export of the data itself to avoid widespread circumvention of European data protection laws.

The contractual approach works fairly well for interaffiliate and outsourcing transactions and for large, regular data flows in established business networks. It is obviously harder to implement and monitor with respect to data flows among numerous affiliated companies and business partners connected via intranet, extranet, or private enterprise networks.

In addition to individual contracts, there are efforts underway to establish model contracts for transborder data flows that would be effective under Article 26 of the Directive. Prominent examples are the models developed by the International Chamber of Commerce, the Privacy and American Business project, the Confederation of British Industry, and the German banking association. However, further development and acceptance of such models is likely to be deferred until the EU and the United States conclude their discussions on the possibility of establishing “safe harbor” principles and procedures that could be approved by the Commission under Article 25(6). Those would have the benefit of greater uniformity and established EU-US consultative procedures in the event of threatened data stoppages. Nevertheless, for transfers to other countries or transfers to the US that are not covered by a Safe Harbor agreement, contractual commitments remain as an alternative to parallel legislation.


“Safe Harbor.” The United States, as suggested above, is unlikely to adopt an omnibus, European-style data privacy law. Privacy laws, for constitutional as well as political reasons, are likely to continue to be focused on specific activities and to be adopted variously at federal and state level. Some Member States have already deemed the US to provide inadequate data protection and have allowed transfers to the US only with consent or subject to contractual guarantees. There is an ongoing bilateral effort, however, to provide a “safe harbor” or agreed mechanism for American companies to certify compliance with “adequate” principles, subject to effective regulatory or self-regulatory enforcement, with the consequence that their data transfers would be presumed to be adequately protected (subject, of course, to complaints or investigations demonstrating a failure to comply with the Safe Harbor principles in actual practice).

The draft Safe Harbor principles, along with more detailed, explanatory “Frequently Asked Questions” (“FAQs”) and a draft letter establishing procedures for implementation throughout the EU and for consultation with the US Department of Commerce, are posted at www.ita.doc.gov. European perspectives on the proposals are found at http://europa.eu.int/comm/dg15/en/media/dataprot, and the Article 29 Working Party’s papers on Safe Harbor and on third-country transfers generally are found at http://europa.eu.int/comm/dg15/en/media/dataprot/wpdocs/index.htm. If the US and the EU succeed in agreeing on “adequate” principles and enforcement procedures for processing European personal data outside Europe, this will probably serve as the model for many other countries with dissimilar domestic data protection legislation.


Canadian data protection law

Apart from Quebec (with its data protection law on the French model), Canada has traditionally relied on self-regulation for commercial uses of personal information. However, partly spurred by the EU Directive, it is now considering comprehensive data protection legislation applicable to the private sector. (Federal and provincial laws already cover government databases.)

In the early 1990s the Canadian Standards Association (CSA) gathered representatives from the public sector, industry (including transportation, telecommunications, information technology, insurance, health and banking), consumer advocacy groups, unions, and other interest groups to develop a common code to protect personal information. The group developed the Model Code for the Protection of Personal Information, which is based on the OECD Guidelines. In 1996, the Standards Council of Canada put the CSA Code through a lengthy standard-setting process and adopted it as a national standard. Recently, the government proposed Bill C-54 to legislate data protection by the year 2000 in the federally regulated industries (and to mandate it within three years at the provincial level). It appears likely that the legislation will codify some version of the principles contained in the CSA standard, subject to investigation and enforcement by federal and provincial authorities.

Australian data protection law

Australia enacted the Commonwealth Privacy Act 1988 with the primary goal of regulating the government’s use of personal information. The Act offers strict privacy safeguards that the federal government (Commonwealth) and Australian Capital Territory (ACT) government agencies must observe when collecting, storing, using and disclosing such information. The Act also provides individuals with access and correction rights for their own personal information. Similar laws apply in the states.

Amendments adopted in 1990 extended coverage to credit reporting, paving the way for legally binding rules for the handling of credit information by credit reporting agencies and credit providers. Specifically, the provisions:


  • limit information that can be retained, and the duration of storage;

  • limit access to credit providers for specified purposes only;

  • limit the use of credit reports to credit decisions and debt collection;

  • prohibit credit providers from disclosing creditworthiness information about an individual, except where:

      • the individual has given consent for disclosure to another credit provider;

      • the disclosure is to a mortgage insurer; or

      • certain limited information is disclosed to a debt collector.

The Act created a national privacy commissioner with certain rulemaking and audit powers and the authority to process complaints. The commissioner issues a legally binding code of conduct for credit reporting and makes determinations (rulings) on the law. However, most complaints are resolved through negotiations and rarely require a formal “determination” by the commissioner. The commissioner is also vested with the authority to encourage corporations to develop privacy policies consistent with the OECD guidelines. In February 1998, the Commissioner exercised that authority and issued non-binding guidelines entitled, “National Principles for Fair Handling of Personal Information.” The principles are based on the OECD Guidelines and intended to form the basis of industry-developed voluntary privacy codes of conduct. Recently, the government has indicated that industry will be obliged to develop or accept such codes, or face further legislation.

The Australian Telecommunications Act 1997 includes rules on the use and disclosure of personal information held by telecom carriers, carriage service providers and others. The Australian Communications Authority is authorized to register voluntary industry codes. Though the codes are non-binding, the Authority can issue binding standards where industry fails to act. In addition, the privacy commissioner has oversight of recordkeeping in relation to the disclosure of personal information in this industry.
Data protection in Japan


  1. Traditional Approaches to Jurisdiction

Unlike securities and banking regulation, antitrust law, and certain other legal regimes, data protection-related laws seldom include explicit jurisdictional provisions, and there is little caselaw to date dealing with the application of jurisdictional doctrines to claims of privacy infringement. We briefly summarize traditional jurisdictional doctrine in the US and internationally, and then examine the challenge of applying it to laws designed to protect privacy.


US Jurisdictional Principles
In the federal system of the United States, it has been necessary to elaborate jurisdictional and conflict of laws doctrines to manage the frequent tensions between federal and state lawmaking and between the laws of one state and another. US constitutional limits on the exercise of state jurisdiction are discussed more fully in Section IA of this Report. Basic jurisdictional doctrines in the United States are briefly summarized below so that they can be applied to privacy issues.

Prescriptive jurisdiction. Prescriptive (“legislative”) jurisdiction is the authority to make law. When more than one state or country has prescribed law that could reasonably be applied to the conduct at issue, courts are required to engage in a conflict of laws analysis and, even if they retain the case, choose the law to apply. This choice is typically based on an assessment of which jurisdiction has the most significant relationship to the parties and the conduct and the weightiest state interests at stake, as discussed more fully in Section IA of this Report.



In the United States, the allocation of prescriptive authority between the states and the federal government is governed by the Tenth Amendment of the federal Constitution, which reserves legislative powers to the states except in those specific areas where (a) lawmaking is constitutionally the exclusive domain of the federal Congress or (b) the federal Congress has enacted legislation within its authority that must, in order to be effective, preempt contrary state legislation. Most federal commercial regulation is promulgated under the Interstate Commerce Clause of the Constitution, authorizing Congress to regulate the interstate and foreign commerce of the United States. There is also jurisprudence establishing a “dormant Interstate Commerce Clause” doctrine that precludes state legislation (even in areas where the federal Congress has not adopted pervasive legislation) where state laws would have the effect of unduly burdening the interstate or foreign commerce of the United States. The federal and state legislatures often delegate more detailed rulemaking authority to regulatory bodies, such as the Federal Trade Commission and state consumer protection agencies, but the scope of their prescriptive jurisdiction is still derived from the relevant constitutional or statutory provisions.

Adjudicatory (or “judicial”) jurisdiction, the authority to hear and rule on individual legal claims, is granted to courts or regulatory agencies. It consists of two elements. There must first be subject-matter jurisdiction, authority to adjudicate matters under the applicable statutory or common law. In the US today, subject-matter jurisdiction is typically based on proscribed conduct or effects occurring within the territory of the state (or of the federal district). The court or agency must also have personal jurisdiction over the parties, to compel their attendance and subject them to preliminary orders and the final judgment. Residents of a state (or of a federal district) are typically subject to personal jurisdiction there for any cause of action. Corporations are normally considered residents of the state and district in which they are incorporated. Non-resident individuals or corporations are also subject to personal jurisdiction, either specific or general, under the following circumstances:



  • Specific jurisdiction over a non-resident, with respect to a particular claim, requires only “minimum contacts” with the forum state, so long as that “does not offend ‘traditional notions of fair play and substantial justice,’” in the classic formulation of International Shoe Co. v. Washington, 326 US 310, 316 (1945). Minimum contacts may be found if (1) the defendant purposefully directed its relevant activities toward a state resident or “purposefully availed” itself of the privilege of conducting business in the state, (2) the claim arises from those activities, and (3) the exercise of jurisdiction is reasonable. (See, e.g., Core-Vent Corp. v. Nobel Industries AB, 11 F.3d 1482, 1485 (9th Cir. 1993).)




  • General jurisdiction, over any and all claims against a non-resident, regardless of their specific nexus with the forum state, requires the non-resident’s “systematic” and “continuous” contacts with the forum, such that the non-resident would anticipate defending any kind of legal claim there (see International Shoe, 326 US at 318).




  • Fairness factors. Assuming a non-resident, under the test for specific jurisdiction, has purposefully directed its relevant activities to the forum state or availed itself of the privilege of doing business there, a court must still assess the fundamental fairness of asserting jurisdiction over the non-resident. In World-Wide Volkswagen Corp. v. Woodson, 444 US 286, 292 (1980), the Supreme Court listed five factors to consider in this assessment (without indicating their relative weight):

1. The burden on the defendant of appearing in the forum;

2. The forum state’s interest in adjudicating the dispute;

3. The plaintiff’s interest in obtaining convenient and effective relief;



4. The forum state’s interest, shared with the interstate judicial system, in providing for the efficient resolution of controversies; and

5. The shared interest of the states in furthering substantive social policies.


Enforcement jurisdiction, the authority to detain persons, seize assets, appoint a receiver or administrator, etc., in order to enforce a court judgment, normally applies wherever the court has adjudicatory jurisdiction. In addition, that court judgment must be enforced by other courts anywhere in the United States, with respect to persons or property found in their territory, under the Full Faith and Credit clause of the federal Constitution.
Application to privacy laws. As demonstrated above, there is a wide variety of privacy-related legislation in the United States. The federal acts normally specify the subject matter jurisdiction of federal (and sometimes state) courts and apply either uniformly throughout the nation or only to activities in specific interstate businesses. The FCRA, for example, applies nationally to defined consumer reporting activities, based on a congressional determination that credit reporting practices substantially affect interstate and foreign commerce. Enforcement is left to the Federal Trade Commission, which may bring actions in federal district courts in cases of noncompliance, and injured parties may also bring private actions in the federal courts. Notably, the states are free to adopt additional legislation governing consumer reporting conduct in their territories; these may provide supplemental but not contradictory protections. In other situations, federal legislation applies only to federally regulated entities, as in the case of the Electronic Funds Transfer Act, which is enforced by the federal bank supervisory authorities with respect to the institutions that they regulate or insure. In still other instances, such as the Telemarketing and Consumer Fraud and Abuse Prevention Act, the federal law creates a private right of action in both federal and state courts and also authorizes enforcement by federal agencies and authorizes suits brought on behalf of all affected residents by a state attorney general. In the case of state laws governing the use of personal information for commercial purposes, the laws are often silent as to their territorial scope. Nevertheless, state courts and agencies are constrained by the constitutional limits to their jurisdiction discussed above, as well as by any conditions or restrictions found in the specific legislation at issue or in the state’s “long-arm” statute on adjudicatory jurisdiction.



