
THE CENTER FOR AMERICAN PROGRESS HOLDS A DISCUSSION ON "TRACKING TECHNOLOGY: BALANCING INNOVATION AND PRIVACY"

JUNE 27, 2011

SPEAKERS: JULIE BRILL, COMMISSIONER, FEDERAL TRADE COMMISSION

ED FELTEN, CHIEF TECHNOLOGY OFFICER, FEDERAL TRADE COMMISSION

JIM STEYER, FOUNDER AND CEO, COMMON SENSE MEDIA

PETER SWIRE, SENIOR FELLOW, CENTER FOR AMERICAN PROGRESS

CHRIS WOLF, PARTNER, HOGAN LOVELLS LLC

NEERA TANDEN, CHIEF OPERATING OFFICER, CENTER FOR AMERICAN PROGRESS

[*] (JOINED IN PROGRESS)

BRILL: ... a mere 20-pound electric typewriter, an IBM Selectric no less, with a correcting ribbon, and my cool new turntable.

My son leaves for UMass-Amherst in a few weeks with a cell phone that has more computing power than was available to the entire Computer Science Department at Princeton in my day, and he wants an upgrade.

As President Clinton succinctly put it in his commencement address, 10-year-olds can find things on the Internet that I had to go to university to learn.

When I look at the brave new cyber world Tom Hanks and Bill Clinton captured so well, I can't help but think of the words of Shakespeare's Miranda: "Oh wonder, how many goodly creatures are there here?"

Because of innovations in the Internet, social media, mobile communications, and location-based apps, we can now become friends with people whose voices we've never heard. We can reconnect with folks we knew years ago but lost touch with. We can tweet our thoughts to a cyber cafe full of anyone who will listen. We shop for groceries online, go to the movies online, share photo albums online, pay traffic tickets online, and even date online.

Today we see aid workers delivering prenatal care and vaccinations to the farthest corners of the developing world using mobile phones and online data banks, and we watch as populist movements armed only with Twitter and the Internet bring down dictatorships.

Miranda had it right, oh wonder indeed.

But of course that's not the whole picture. Allow me to quote Tom Hanks one more time. "A sober look shows that just as the world has gotten to be a better place, after all, it has also grown a bit worse at the exact same rate. A one-step-up, one-step-back sort of cosmic balance between forward progress and cultural retreat."

Just as the Internet and other technological innovations are extending our reach to the limits of our imagination, those providing us with all this are reaching back, harvesting and trading in information about where we are, what we do, who we meet, and what we buy.

The amount of tracking of an individual's behavior online -- what sites she visits, what ads she clicks on, what she says when she chats, and where she wanders through the day -- is unprecedented, and since it is largely undetected by the consumer, it can become an encroachment on consumer privacy, the yin to all this wondrous cyber yang.

For two decades the FTC has monitored and worried about the price, in terms of privacy, that consumers are paying for access to our burgeoning cyberspace. We've worked to preserve consumers' control over their private data since as early as the 1990s, when we relied primarily on a notice and choice model, counting on businesses to give consumers clear choices about how their data is used and counting on consumers to read and understand privacy policies before making their choices.

The theory is sound, but it has proven unworkable. It is not reasonable to expect consumers to read and understand privacy policies, most about as long and as clear as the Code of Hammurabi, especially when all that stands between them and buying that new flat-screen TV or downloading the latest version of Angry Birds is clicking the little box that says, "I consent."

The commission has also played defense, focusing on privacy violations that cause indisputable harms -- data breaches, identity theft, invasions of children's privacy, spam, spyware and the like.

But this approach falls short as well. It only addresses infringements on privacy after harm has been done, giving too little incentive to companies to design systems that will not do harm in the first place.

Also, by focusing only on tangible harms to consumers, this approach misses the less quantifiable but nonetheless real injuries suffered by those whose sensitive information -- about medical conditions, children, or sexual orientation -- is exposed.

Further, neither the notice and choice approach to privacy nor the harm-based model speaks to advances in technology that present ever more sophisticated opportunities to collect data, including the ability to gather information about consumers' every move from their smart phones, and ever more sophisticated opportunities to manipulate data, including the ability to take information that has been stripped of personal identification and re-associate it with specific individuals.

Our new reality, this new reality, has led the Federal Trade Commission staff to prepare a preliminary report proposing a new privacy framework, called "Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers."

The report makes three principal recommendations.

First, we call for companies to build privacy and security protections into new products, not just retrofit them after problems arise. When designing new products and services the level of security and privacy protection should be proportionate to the sensitivity of the data used. And companies should limit the amount of information collected to what is needed and retain it only as long as needed.

Second, we call for privacy policies that consumers can understand without having to retain counsel. The report suggests that one way to simplify notice is to exempt what we have called commonly accepted practices from the first layers of notices, practices like sharing data with a shipping company that will deliver the product that you just ordered. When these disclosures of obvious uses of data are culled from notices, the consumer can focus on more pertinent uses of her information.

And third, we call for companies to be more upfront with consumers about how they collect data, how they use it, and how long they keep it. Companies need to share with consumers the profiles they are compiling, especially if these profiles are informing decisions about loans, insurance, employment and other sensitive matters.

When taken as a whole, I believe the framework we have proposed is flexible enough to allow businesses and consumers to continue to profit from an innovating, growing and rich information marketplace, and sturdy enough to provide guideposts on how to continue to innovate, grow and enrich in a responsible manner.

Now, the commission's most talked-about recommendation, and the one perhaps most relevant to the issues that we're going to talk about today, is the creation of a "Do Not Track" mechanism to allow consumers meaningful control over how their online behavioral information is used. A majority of the Federal Trade Commission's commissioners, myself included, supports the creation of such mechanisms.

Our proposal is a technology-driven approach that will allow consumers to make persistent choices that travel with them through cyberspace, communicating their tracking preferences to every website they visit. It doesn't have to be all or nothing. Consumers can be given refined choices about what information is collected and how it is used, giving consumers more meaningful control over their personal information.

The commission believes there are five essential components to a "Do Not Track" mechanism.

First, it must be simple for consumers to find and use.

Second, it must be effective. Companies must honor the tracking choices consumers make or face enforcement actions.

Third, the "Do Not Track" mechanism must apply across companies and technologies. Consumers should not be expected to make tracking choices on a company-by-company basis. This raises the issue, also flagged in our report, of whether "Do Not Track" should be extended to the mobile environment. With so much information about consumers exchanged in that space, I believe the answer is yes.

This branch of the information superhighway is in desperate need of basic reform. A recent study by the Future of Privacy Forum found that out of the top 30 paid apps, 22 did not even have a basic privacy policy.

The fourth criterion for judging whether a "Do Not Track" mechanism is effective is whether it will do more than just prevent the consumer from receiving targeted advertising. It must provide the consumer with an opportunity to stop the collection of information about her online behavior.

And fifth, the choices consumers make through "Do Not Track" should be persistent. That is, customers should not have to reset their preferences every time they clear their cookies or close their browsers.

Now, to its credit, the industry is working on developing "Do Not Track" mechanisms. Ed Felten, the FTC's chief technologist and a noted computer science expert, is here to talk in great detail about the technology involved.

We brought Ed to the Federal Trade Commission from Princeton, by the way, where he did an admirable job helping my alma mater's Computer Science Department outstrip my son's phone.

We knew we needed this sort of high-powered technical expertise Ed and his team bring to the FTC if we were going to meaningfully engage industry in discussions about workable, effective "Do Not Track" mechanisms that can function in the traditional online environment as well as the mobile space.

Now, as Neera mentioned, I spent last week in Brussels speaking at a number of different conferences and workshops, including one that focused on "Do Not Track" and online tracking generally.

Two things stood out there. First, there is tremendous momentum internationally for "Do Not Track" mechanisms. And second, from a policy perspective, the European Commission is approaching the issue of behavioral advertising in much the same way we are.

Everyone recognizes that behavioral advertising helps support online content and services, and that many consumers value the personalization such ads provide. But we are also all concerned that much of the tracking underlying this advertising is invisible to consumers, who at present do not have real choices about how or if their personal information about their cyber behavior is collected and used.

I want to spend just a few minutes talking about online privacy and tracking related to children. While we have a responsibility to protect all consumers, that responsibility increases for children. The Federal Trade Commission enforces the Children's Online Privacy Protection Act, or COPPA, and its implementing rule. COPPA requires operators of certain websites and online services to provide protections in connection with children's information.

Interactive websites and online services directed to children under 13 and operators of general audience sites and services, having knowledge that they've collected information from children, must all comply with COPPA.

The commission recently announced its largest civil penalty in a COPPA case, a $3 million settlement against Playdom, a leading developer of online multi-player games. We alleged that the company and one of its executives violated COPPA by illegally collecting and disclosing personal information from hundreds of thousands of children under age 13 without their parents' prior consent.

The COPPA rule went into effect in 2000. We began a review of the rule last year, five years before we had to, to ensure that the rule continues to work in today's new technological environment, especially with the rapid expansion of mobile communications.

Now, the review is ongoing, but the public comments we received and the roundtable discussions we held indicate that there is widespread consensus that COPPA and its implementing rule are written broadly enough to encompass most forms of mobile communications.

For example, technologies such as interactive mobile applications, games and social networking services that access the Internet are clearly online services covered by COPPA.

There seems to be less consensus, however, as to whether certain other mobile communications, such as text messages, are online services that come under the rule. We continue to look at this question pretty closely.

And while COPPA encompasses our responsibility to protect children's privacy online, it doesn't relieve us of the obligation to prepare children to become consumers who will make wise and responsible choices about their online behavior.

We're particularly proud of our educational booklet "Net Cetera: Chatting With Kids About Being Online," which provides practical tips on how parents, teachers and other trusted adults can help children of all ages, including teens and pre-teens, reduce the risk of inappropriate conduct, contact and content that come with living life online.

"Net Cetera" focuses on the importance of communicating with children about issues, ranging from cyber bullying to sexting, social networking, mobile phone use and online privacy.

Through our partnership with schools, community groups, and local law enforcement, the FTC has distributed more than 7.8 million print copies of the guide over the last couple of years.

I'd like to end today with some thoughts from a Princeton graduate who knows a little bit about the opportunities and perils of cyberspace: Jeff Bezos, founder of Amazon.com and Princeton University's commencement speaker last year.

Back then he told the graduates, "Tomorrow, in a very real sense, your life, the life you author from scratch on your own, begins. When you are 80 years old and in a quiet moment of reflection, narrating for only yourself the most personal version of your life story, the telling that will be most compact and meaningful will be the series of choices you have made. In the end, we are our choices."

The FTC's work on privacy and on tracking is all about keeping that inspiring statement true. We want to build a rich online environment where individuals can make meaningful choices about how they present themselves to the world, and that can only come about when individuals control private information about whom they talk to, what they say, where they go, and what they do in cyberspace, mobile space, and beyond.

Thanks very much.

(APPLAUSE)

TANDEN: And I think Commissioner Brill has time for a few questions, which I'll help moderate. If you could raise your hand and also just identify who you're with. Any questions?

QUESTION: In terms of protection (inaudible) consumers, you mentioned about FTC. And I just wonder if you will be able to (inaudible) this area, because a lot of (inaudible) spying surveillance by those maybe (inaudible) law enforcement, but most important is the contractors or financial institutions, or the contractors, or even phony persons or attorneys or (inaudible) attorneys (inaudible) or they're contractors again.

So would you be able to (inaudible) this area, make real connections and real force for (inaudible) and investigation, really bring them to the jail (inaudible) from local to federal, and then you can reduce their pension or any pension toward those retirement who are doing misconduct or unlawful act?

BRILL: Thanks for the question.

You know, it is definitely the case that in tough financial times, which consumers have experienced over the past couple of years, the opportunity and incentive for scam artists grow, and that includes with respect to informational scams -- folks who are trying to gather information from consumers and use that information in inappropriate ways.

We have seen a growth in those kinds of scams over the past few years, and it's something that we are very much targeting to try to ensure that consumers' information is protected from financial scams.

And just to touch on another aspect of what I think the question focused on, financial institutions are subject to the Gramm-Leach-Bliley Act and the underlying Safeguards Rule and some other provisions that particularly focus on financial institutions.

Similarly, credit reporting agencies, which hold a great deal of financial information -- in some ways probably the most sensitive financial information that consumers can have -- are also subject to the Fair Credit Reporting Act, which does contain some very important protections for consumers.

TANDEN: Are there additional questions?

Go ahead.

QUESTION: Commissioner Brill, since you mentioned COPPA, I just wanted to ask you what will seem a very minor question, but I think is actually quite important to those of us who follow COPPA and worry about the potential for COPPA's expansion.

In the press release that went out about the Playdom settlement, there was a line in there from Chairman Leibowitz saying that those that operate sites that appeal to kids owe it to parents and kids to obtain parental consent.

And that's subtly different, of course, from the actual standard in COPPA, which is that the requirements of COPPA apply, as you said, either in cases of actual knowledge or where a site is directed at kids.

And so some of us have worried that this may signal some sort of a shift in the FTC's understanding of what "directed at" means, because, of course, "directed at" has to do with how I send the site out, whereas "appeal to" might -- might be understood to apply from the other side.

It might be understood to suggest that, if a certain number of kids found a site appealing, even if the operator hadn't directed it to them, it could be covered by COPPA and that, in turn, this could require age verification mandates for large numbers of sites that are not currently covered by COPPA and run into the same sort of First Amendment problems that (inaudible) raised.

So I wonder if you could just comment on that and explain that, or allay any concerns we might have?

BRILL: Don't be concerned. There's no shift.

(LAUGHTER)

TANDEN: That's easy.

BRILL: That was easy.

(LAUGHTER)

Really, I mean, I could go on, but just to make it quick and simple: I think, you know, press releases are written in a way to try to attract attention and really help the average consumer understand what's going on, but we are not, through the Playdom case, intending at all to shift what the statute requires. And you've laid out quite accurately what the statute requires.

TANDEN: I think we have one or two more -- time for one or two more questions. Right there?

QUESTION: Hi. I'm with (inaudible), which is a Web and mobile platform that allows individuals to own and control and ultimately benefit from their personal data. And we as a company fully support and agree with the three principles in the new framework that you outlined.

My question is really about the second -- the second point that you outlined for "Do Not Track" about effectiveness. How would "Do Not Track" be enforced?

Is that something that the FTC would do or do you see other groups getting involved in that?

BRILL: That's a very helpful question. Thanks. You know, "Do Not Track" -- and our proposal for "Do Not Track" -- is designed to encourage industry to develop "Do Not Track" mechanisms, and also designed to inform policymakers, such as Congress and others that are interested in this issue, about how it can be appropriately structured.

Whether it turns out to be industry self-regulation or an industry-driven technology approach, or whether it ends up coming from a legislative approach, enforcement is a very, very important component of it.

To the extent that it is self-regulatory, what it will require is an agreement by the vast bulk of industry, and that includes Web browsers, ad networks and websites to honor the information that they're receiving about consumers' choices.

Now, right now, I have a bit of concern about whether or not we will get to a place where all of industry can effectively honor those choices. So that is a big issue around the self-regulatory approach.

Of course, to the extent that anyone makes a promise that they will honor a choice that is made by consumers, the Federal Trade Commission can enforce that promise because they would be engaging in a potentially deceptive act if they didn't honor that promise.

Now, in a legislative approach, obviously that becomes very different and can be dealt with in some ways in an easier and more streamlined fashion. We are very much looking to industry -- to all of industry: ad networks, advertisers, and websites -- to indicate whether or not they will honor some of the technological advances that have been made in order to inform them about consumers' choices.

QUESTION: If the "Do Not Track" preferences apply across companies and technologies, that suggests it's not device-specific. My choice would apply to this device, this device, the one under the chair. How, or rather where, would those preferences then be stored?

BRILL: So we'll pretty soon reach the limits of my technological knowledge. And as I mentioned, Ed is here and I would strongly encourage you to ask him that question.

But my fundamental, basic understanding, which is probably good enough for maybe nine-tenths of you in this room, is that to the extent it's a browser-based mechanism, it would be stored at the browser. To the extent that it is instead an icon-based mechanism, which is one that the Digital Advertising Alliance is offering up, or beginning to develop, it would be stored on those websites.

It might be that choices have to be made with each one of your devices; that is, your computer, your mobile phone, and your iPad; but once you make those choices we want them to be persistent with respect to that device. Did I get that right, Ed?

Thumbs-up?

FELTEN: I love when that happens.

(LAUGHTER)

TANDEN: Do we have any additional questions?

With that, thank you so much, Commissioner Brill.

BRILL: Thank you. Thanks so much.

(APPLAUSE)

TANDEN: And I'll invite the panel up.

Thank you so much. We're going to dive right in. I'm going to introduce the panel quickly and then let them have a little time for remarks, and we'll get to your questions in short order.

It's my honor to introduce Ed Felten. He is, as Commissioner Brill said, the chief technologist at the Federal Trade Commission. He's on leave from Princeton University. His research interests include computer security and privacy, especially relating to media and consumer products, and technology law and policy. At Princeton he was a professor of computer science and public affairs and the founding director of Princeton's Center for Information Technology Policy.

So we're glad to have you in the federal government, working on these issues. And, finally, he was elected to the American Academy of Arts and Sciences.

First, I just wanted to ask broadly whether you'd like to elaborate at all on the framework that Commissioner Brill referenced and the kind of policy choices at the heart of that framework.

FELTEN: Sure. And I guess let me elaborate a little bit with respect to these tracking technologies and how a "Do Not Track" system would fit within that framework. Does that make sense?

TANDEN: Great.

FELTEN: So basically what we're talking about, at a nuts-and-bolts level, is giving users -- or, in the case of younger children, parents -- some better control and choice over the ability of the websites that the user visits, or of third parties, to accumulate records of what users do over time and across different websites.

And, in the mobile setting, there is concern about tracking of location, of the user's physical geographic location over time.

And so the question is how to provide choice that's meaningful for users about that. And, really, one of the things that's become clear over the last few years is that there are lots of different technologies that a website or a third party might use to track a user across the Web, starting with cookies, Flash cookies, HTML5 local storage, cache storage, browser fingerprinting, et cetera, et cetera.

So rather than trying to fight each one of these one by one and asking users to educate themselves about every one of these technologies and what to do about them, it makes sense to have a single choice mechanism that applies, as Commissioner Brill said, across all of the different technologies for tracking and across different companies that might try to track.

And there are a couple of technologies that have emerged and have been shipped in browsers to do that, which I could talk about some more if -- if you like.

TANDEN: Go ahead.

FELTEN: Do you like?

TANDEN: That would be great.

FELTEN: OK, great.

So there are two main tracking control technologies that are out there now. First let me talk about the tracking protection list mechanism that is in Microsoft's Internet Explorer 9 browser.

And the basic idea here is that a third party can make a list of sites that are engaged in tracking, and a user can subscribe to one of these lists; a multiplicity of different lists exist.

And what that list does is tell the browser to block access to certain websites or servers as a third party, not as a first party. If the user chooses to go to that site directly, then the browser will go there; but if the user goes to site A, and site A includes content from site B, and site B is on the list to block, then that interaction will be blocked.

This is a way of shutting off access to third-party tracking sites, regardless of what technology they're using. And the user has a choice of which tracking protection list, if any, they want to subscribe to.
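The third-party blocking rule just described can be sketched in a few lines. This is a hypothetical simplification, not Internet Explorer's actual implementation; the function name, the list representation, and the domains are invented for the example.

```python
# Sketch of a tracking protection list decision, per Felten's description:
# a subscribed list names servers to block, but only when they appear as
# third parties. First-party visits the user chooses are always allowed.

def is_blocked(first_party: str, request_domain: str, protection_list: set) -> bool:
    """Return True if the browser should block this request."""
    # Going to a listed site directly (as the first party) is permitted.
    if request_domain == first_party:
        return False
    # Third-party requests to listed servers are blocked, regardless of
    # which tracking technology (cookies, fingerprinting, ...) they use.
    return request_domain in protection_list

tpl = {"tracker.example"}
# Site A embeds content from a listed site B: blocked as a third party.
print(is_blocked("news.example", "tracker.example", tpl))   # True
# The user visits the listed site directly: not blocked.
print(is_blocked("tracker.example", "tracker.example", tpl))  # False
```

The design choice worth noting is that the rule keys on the first-party/third-party relationship, not on any particular tracking technique, which is what lets one mechanism cover cookies, local storage, fingerprinting, and the rest at once.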

So that's the first technology, which Microsoft has shipped. The second anti-tracking technology, or tracking control technology, is a "Do Not Track" flag, which is currently supported by Mozilla Firefox, on both their desktop and mobile browsers, and reportedly will be supported by Apple soon.

And this is a technology that the user can turn on which causes the browser to send to third parties an indication of the user's preference, saying this user has chosen not to be tracked. And then the websites can respond to that by refraining from tracking.

For this to work -- for the user's preference to be actuated -- the website must refrain from tracking.

So this is supported by other browsers, and it's a system that does require the websites to do certain things. In a situation where the user has access to both of these technologies, they might ask the site nicely not to track them, and if the site doesn't comply, they might then use the more emphatic tracking protection list mechanism to shut that site off.
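For concreteness, here is a minimal sketch of how a cooperating server might honor the "Do Not Track" flag, which the browsers mentioned transmit as an HTTP request header (`DNT: 1`). This is an illustrative assumption about one possible server-side check, not any site's actual code; `should_track` and the sample headers are invented for the example.

```python
# Sketch of the server side of the "Do Not Track" flag: the browser
# expresses the user's preference in a request header, and a cooperating
# site checks it before setting any tracking identifiers.

def should_track(request_headers: dict) -> bool:
    """Return False when the user has expressed a Do Not Track preference."""
    # The DNT header carries "1" when the user has opted out of tracking.
    dnt = request_headers.get("DNT", "").strip()
    return dnt != "1"

# A browser with the flag turned on sends DNT: 1 with every request.
headers = {"Host": "example.com", "DNT": "1"}
print(should_track(headers))  # False: the site should refrain from tracking
```

This also makes the contrast between the two mechanisms visible: the tracking protection list is enforced unilaterally by the browser, while the flag only works if the receiving site chooses to honor it.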

So these technologies exist. They're out there. Three of the four major browsers now support them. And we're now seeing a dialogue about how to put the choices that users are expressing into effect.

TANDEN: Let me introduce Jim Steyer. He's the founder and CEO of Common Sense Media, the nation's leading nonpartisan organization dedicated to improving the media lives of kids and families. I'll just briefly go through his bio.

Previously he served as the president of Children Now, a leading national advocacy organization for children, which he founded in 1988. Jim has had a career as an elementary school teacher and a civil rights lawyer. He's worked for the NAACP and served as a law clerk for the California Supreme Court. So we're very lucky to have him here.

And I wonder if, Jim, you could just elaborate on the concerns that parents have in particular and what steps they can take in this brave new world.

STEYER: OK. Well, at Common Sense we believe that privacy is an incredibly important issue for kids and families across the globe. We know this is a global issue. And I'm going to talk a little at the end about not just legislative and some of the stuff we're talking about here, but some of the other impacts that unfortunately none of us can rule upon, but would still have an incredible impact on kids.

And I think that anybody who's a parent out here knows that the impact today of digital media -- cell phones, the Web, et cetera -- on kids is extraordinarily different than it is for anybody sitting up here, totally different. And I have four kids. I'm a professor at Stanford as well, so I talk to my Stanford kids about this all the time, too. And they look at it not just in the context of the Web, but very much on the mobile phone and also on the social networking platforms, on which much of the new stuff that's happening where we live out in the Bay Area is being built, sort of on a Facebook-related model.

We have a very straightforward set of principles at Common Sense about this stuff, and it may not make everybody in the audience happy, particularly people who either, A, aren't parents, or, B, take more of a libertarian approach to this issue. But we have five basic principles and we're going to stick to them. We believe they matter for every kid and teen here in the United States and across the globe, and they're basically as follows.

First, we believe there should be no tracking of kids -- you should not track kids -- and there should be no third-party behavioral marketing to kids. We think that it's unfair and deceptive. And we think right now that the Web and mobile phone land (ph) -- we don't really distinguish between the two, because my kids get an awful lot of their stuff on cell phones -- is essentially sort of a Wild, Wild West environment, and that there need to be clear protections for kids.

We do not take a position on "do not track" overall -- we focus on anybody under the age of 18 -- but we believe quite simply that there should be no behavioral marketing to kids. And you can agree or disagree with that, but we just look at it strictly through the lens of kids.

The second thing that we believe is the principle that there should be an eraser button: the industry should take some of the billions of dollars they are making in profits and create an eraser button that allows all consumers, but specifically kids and families, to delete any kind of online information that's out there, whether it's put up by themselves or by other people.

So we actually believe that's a big part of what happens, because kids and adults make a lot of mistakes with what they post and we believe that technology should be developed today, or as quickly as possible, that allows basically an eraser button concept. So we believe pretty strongly in that as well.

In general, we believe that opt in should be the standard, not opt out. What's happened thus far has obviously been extraordinary technological development with great benefits -- at Common Sense we are big believers in both media and technology -- but we believe that opt in is generally a better standard than opt out, because it gives the consumer a choice, as opposed to making it incumbent upon consumers, particularly kids, to figure out how to get out of an existing system.

So we would like to see opt in as the basic standard -- the default standard -- certainly for kids and teens.

The fourth thing that we really believe is that privacy policies need to be clear and transparent and easy to understand, and right now they're not. So we believe there needs to be a big push on that, which obviously involves a lot of work by industry to simplify this and make it easy for the average consumer to understand what's said.

But we believe that that's a big -- a big part of the solution going forward. It is just simple, clear, transparent privacy policies that everybody understands. We understand that that won't happen overnight, but we think it needs to be a big focus of both legislative and regulatory efforts, but more importantly, industry's own efforts.

And I think the last big thing we believe is that there needs to be massive public awareness and public education about online privacy. And that actually we think should be primarily funded by the industry.

And, again, we work very closely with the leading companies in the online and mobile space, we know the people at all of those companies quite well. We are big believers in what we call sort of a "sanity, not censorship" approach to anything.

So we understand that this is a complex area, that some of the stuff Ed is talking about and I'm sure that my colleagues up here on the panel will talk about aren't that easy to do. And that said, we think in the best interests of kids overall this -- that this is imperative going forward.

Switching over in a sense -- I was talking to Neera's colleague John Podesta earlier today about this, about his grandchildren -- one of the things I would say to everybody here is to move, just for a second, out of what the classic privacy debate is here in Washington, and say something that I think is also really, really important in terms of understanding what's going on in childhood and child development today.

I actually think, and we at Common Sense believe, that the new technology has extraordinary opportunities and pluses on the positive side. It's transforming economic models and lots of different ways that we as consumers and young people can learn and behave.

And that said, it's an extraordinary change in interpersonal human communication and behavior, and it's happened in a remarkably fast time frame with no mediation by society. It's basically been run, in our opinion, by the technology industry.

And we're not imputing bad motives to the industry, but I will tell you, as the father of four kids and as a teacher and a professor, that the impacts on cognitive development, child development, and the social and emotional behavior of kids because of the extraordinary new technological advances just in the past five years alone are unbelievable, and they are changing the nature of childhood.

And so you cannot forget that in the context of sort of narrow discussions about what this law should say or this regulation by Julie Brill or other folks at the FTC should say. This is an extraordinary change in childhood and in learning, in child development, and in the cognitive development and social and emotional development of young people, so we have to build in protections for them. And it's really important that we do that now and that we also begin a very big public discussion of what this is about.

Because ultimately it is going to be primarily in the hands of kids and parents to understand how to deal with this. And I actually think it's in the best interests of everybody out there to broaden the discussion just off the narrow privacy issues, which we know are very important, but into the broader issues of what is happening to people's brains and their social and emotional development today, particularly kids who have very different cognitive and social and emotional needs than adults do and who are growing up in a world that is fundamentally different than everybody out there that I can see grew up in.

So I think that's the context that we ought to look at this in and I think that if you're in the field of kids and teens like Common Sense is that we need to keep really pushing hard for the rights of kids and teens in this debate because ultimately they are going to be the big beneficiaries of some of the good stuff, but the big losers of some of the downsides. And we need to protect them and we need to have a very important public discussion about these issues for years to come.

TANDEN: Great. Thank you so much, Jim.

Let me also introduce Chris Wolf, who leads the global privacy and information management practice at Hogan Lovells, an international law firm with 2,500 lawyers and 43 offices. He also is the founder and co-chair of the Future of Privacy Forum, a leading privacy think tank based in Washington.

Chris has been involved in numerous children's privacy issues representing SONY BMG before the FTC in the COPPA settlement consent decree, and in his role as the first outside general counsel to the International Center for Missing and Exploited Children.

Chris, do you have thoughts specifically on the legislation that we're facing, legislation that is both focused on children and more broadly?

WOLF: I do, but let me preface my description of the Do Not Track Kids bill that's been introduced by saying that very few people disagree with Jim in terms of the need to empower parents and to better protect kids and teens. But it's where the rubber hits the road that you really get into some difficulty.

And so as my part of the panel, I want to focus our attention on the bipartisan bill that was introduced on May 13th by Congressman Ed Markey, Democrat of Massachusetts, and Joe Barton, Republican of Texas. It's H.R. 1895 for those of you counting, and it's called the Do Not Track Kids Act of 2011.

Interestingly, it was introduced one week after a discussion draft was circulated, and there was a lot of discussion and the discussion continued after it was introduced, which may have been the reason why it was introduced to focus the kind of attention that Jim is talking about.

The bill would amend COPPA in some pretty significant ways. It would prohibit online companies from using personal information of children and teens for targeted marketing purposes. It would establish a digital marketing bill of rights that limits the collection of personal information of teens, including geolocation information of children and teens. And in the first U.S. iteration of the European concept of the right to be forgotten, it would implement Jim's proposal for an eraser button that would allow parents to eliminate kids' personal information already online.

Now, clearly this legislation goes well beyond existing law. Commissioner Brill described COPPA, which requires websites aimed at children under 13, or with knowledge that they are collecting from such children, to obtain parental permission. But it seems that the do-not-track-kids regulatory regime would require mandatory age verification of all web surfers, which would raise the kinds of serious constitutional issues that were involved in the COPA attempts in the late '90s.

And while most people have applauded the motivation of this bill and share Jim's and others' concerns about protecting kids, I have to tell you the reactions have been mixed. And when voices as diverse as the Center for Democracy and Technology, the Direct Marketing Association, and libertarian Adam Thierer agree on what the legal and technical issues are, it probably makes sense to look carefully at what this bill would accomplish.

So bear with me for a moment. I'm going to quote from some of these critics. CDT said that while the bill is motivated by sincere concerns for the fate of children's and teens' personal information online, the proposed law, quoting now, "suffers from familiar problems common to proposals targeting minors' data." According to CDT, the act could lead us down a path of mandatory age verification and increased collection of personal information from all users, and could infringe the rights of teenagers to access completely appropriate, lawful speech online.

And as I said, the do-not-track-kids bill would create a new prohibition on targeted marketing to children and minors, and CDT and the others believe the act defines "targeted marketing" in a way that's overbroad and sweeps in significant amounts of protected speech. CDT cites as examples that it would prohibit teens from signing up for e-mail alerts about the popular video game Portal 2, from subscribing to newsletters from private colleges they are applying to, from requesting text message reminders to take their asthma medication, or from getting alerts from their favorite band about upcoming concert dates.

They go on to say that from the mundane to the potentially life-saving, information with any connection to commercial activity would be off limits for teens to request to receive, in direct violation of their First Amendment rights. And on the eraser button concept, they said a lot of people support a user's right to remove data the user has posted online in a space under their control, in a social networking profile for example, or on a blog the user operates. But CDT believes the broad notion of an eraser button is too simplistic to deal with the realities and complexities of online data flows.

The Direct Marketing Association, not surprisingly, is another critic of the bill and shares CDT's concerns about the bill effectively cutting off a broad swath of communications, from colleges to prospective students especially. And they also believe the mechanism is unnecessary because of the self-regulatory programs the DMA is involved with.

Another critic I mentioned, libertarian Adam Thierer, wrote in the Tech Liberation Front blog, quote, "While some of the concerns that motivated the do-not-track-kids act are understandable, there are two very different models of how we might address these problems: Legislate and regulate versus educate and empower. The latter, educate and empower, is the superior framework for dealing with these concerns in light of the practical and principled problems associated with the former, the legislate and regulate path."

So 14 months ago, the FTC started a COPPA review that Commissioner Brill mentioned. When David Vladeck, the head of consumer protection at the FTC was before the Commerce Committee in the Senate recently, Senator Rockefeller expressed some impatience that the review wasn't done, not realizing, I think, that it was being done earlier than statutorily required.

But I think I'll conclude my comments by saying that perhaps when we hear from the FTC on its COPPA review, we might be in a better position to consider whether a specific do-not-track-kids law is advisable. I also want to comment on something that Ed said: it was specifically the attention that the FTC focused on the online tracking problem that motivated industry, the browser companies specifically, to really spring into action and do something meaningful and important.

And perhaps the attention that Jim and his group and the Markey-Barton bill are focusing on the tracking of kids will likewise inspire industry to do something more in the self-regulatory realm.

TANDEN: Great.

Let me introduce Peter Swire, our last panelist. He is the William (inaudible) professor of law at Ohio State University Law School. He is a senior fellow here at the Center for American Progress and has two broad areas of expertise: housing and financial regulation, as well as privacy. He was special assistant to the president for economic policy from 2009 until August of 2010, serving in the NEC under Larry Summers.

From October 2009 until April 2010, he was also the lead person at the NEC on technology issues, including broadband, spectrum, privacy, cybersecurity, and net neutrality. He also was the lead person at the OMB in the Clinton administration on privacy issues and focused particular attention on HIPAA.

I wonder if you could reply a little bit to Chris' comments, and perhaps focus some attention on the idea that obviously some privacy regulations will come at some expense to industry. The question for all of us is where the right balance lies between consumers and industry itself.

SWIRE: Thanks, Neera.

You know, hearing from Jim that child development is at stake, what our children will grow into as they become adults, and then hearing that legislation could be unconstitutional and ruin technology, we can have that familiar moment of despair in Washington where it just seems we're doomed to give up one of our crucial values, and now we're going to go home and cry or something like that.

So I have a slightly more optimistic view, having lived through some of these privacy debates, whether you call it balance or just call it some of the steps that are achievable. Julie Brill in her remarks talked about control over your information, and often that's been talked about as choice. And we actually have figured out ways to do that over and over again.

So for HIPAA and medical privacy, we built in things so that your medical information is not shared for marketing without your permission. For Gramm-Leach-Bliley, for financial data, we have an opt-out before it goes outside of the bank. COPPA was passed in the late '90s so that for children, there's an opt-in story.

But during the last decade, we also got things like CAN-SPAM. It used to be that if you kept getting an e-mail, you had no way to unsubscribe; now, as a matter of law, you can unsubscribe and it works. You know, sometimes you like a car dealer for a while because you're working to buy a car. And then you unsubscribe because you have the car and you don't need it.

We've built these things in by law. We've built them in by technology also. It used to be that when you downloaded software 10 or 12 years ago, there was often no way to uninstall it. We sort of later learned to call that spyware. But now, standard technology when you buy software is that the uninstaller stays on your hard drive. When you're tired of it or you don't like it, you undo it.

So we found ways, technologically, through self-regulation and legislation, to actually manage these things. And one of my questions for kids and privacy is: five years from now, what will be obvious that we should have done, that we'll all do? And there will be something. I don't know exactly what the list is, but we're going to have some things that we do.

One of the things is that an eraser button might be relatively hard to do. It's sort of hard to delete stuff from my computer effectively. It's really hard to delete log files from my website. But with search engines, what we did instead was the big search engines said: after 90 days or 180 days, we're going to stop linking it to your name. So, you know, we'll have all those searches that Peter does identified with that one user for six months or whatever, but from a year ago, two years ago, five years ago, the search engines no longer associate them with that same identity. So it's a time limit, and that way, you know, the 90 days rolls off and then you're not linked the same way.

That would mean limits on reidentification, on fingerprinting and all that. And how to do that is going to be its own trick, but that might be one sensible thing we can do for kids: a time limit, so it rolls over and after a while you're done with it. And there may be ways to do that that haven't been fully explored.
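The time-limit idea Swire describes, where a search engine stops linking old queries to a particular user after a fixed window, could be sketched roughly as follows. This is a minimal illustration, not any company's actual scheme; the field names and the 90-day window are assumptions for the example.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window after which the user link is dropped.
RETENTION = timedelta(days=90)

def anonymize_old_logs(entries, now=None):
    """Unlink user identifiers from log entries older than RETENTION.

    Each entry is a dict with 'user_id', 'query', and 'timestamp'
    (a timezone-aware datetime). Entries past the window keep the
    query text for aggregate statistics but lose the identifier,
    so old searches can no longer be tied back to one person.
    """
    now = now or datetime.now(timezone.utc)
    for entry in entries:
        if now - entry["timestamp"] > RETENTION:
            entry["user_id"] = None  # identifier dropped, query retained
    return entries
```

The design choice here mirrors the panel's point: rather than trying to delete everything (the hard "eraser button" problem), the link between person and data simply expires on a rolling schedule.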

I think -- let me just see. One of the other things I think will be obvious, that hasn't been talked about much in the kids' privacy debate or in the tracking debate, is that there is a risk from the employees at the app companies, a risk from the people who are running the databases: they could be paid by somebody, or go rogue, or download it or whatever, and do a WikiLeaks on everything in the database.

I wrote a paper a couple of years ago called "Peeping," as in Peeping Tom. And politicians get this first. So during the last campaign, Hillary Clinton, Senator Obama, and Senator McCain all had their passport files looked at. Now, in the federal government we have laws on that, but it has not been talked about in industry that these are potentially sensitive databases.

Industry tends to say it's not identified, it's just serving ads and all that. And then The Wall Street Journal says, "Here are the 43 things we have about a person," and if that database were downloaded to the Internet for people to see, people could be reidentified.

And I think if you want a recent example, Anthony Weiner comes to mind. Like, people do stuff.

TANDEN: I was wondering how long it would get there...

(LAUGHTER)

... before someone mentioned Anthony Weiner. We tried hard for it to be a full hour.

(CROSSTALK)

SWIRE: Well, you know, so there's like politicians do stuff that's dumb, but nonpoliticians do also. And one of the ways to be secure going forward is not to have the employees risking that they are going to blow open the database. So I suspect that will be something.

Time limits will be something. I'm not sure what the rest is, but some of it's the browser settings that Ed was talking about, that the major browsers have done. So, rather than total despair, if we had those three and maybe a couple more, we're at least starting to address some of the biggest problems out of this whole thing.

TANDEN: Jim, is there anything you would like to say in response to Chris?

STEYER (?): Well, no, I mean -- sure, but I thought Chris was quite reasonable, as his organization is, and as I think most people are who think about this and try to talk about kids.

I think there are a few big-picture things that strike me. The first is that COPPA was written in 1998, which is like the Stone Age in terms of this. So I actually believe there should be new legislation written, whether or not you agree with everything in Markey-Barton. And I understand -- I thought you made good points. I don't believe it requires age verification, but I get your point about it.

I think there ought to be new privacy legislation in the United States now for kids and teens, period. If you think that a law written in 1998 covers kids and teens adequately, I just think that's nuts. And we ought to have new legislation, and it ought to reflect -- I'm sure it will reflect, because they're going to spend tens and tens of millions of dollars to make sure it does -- industry's perspective.

But I also think it ought to reflect the basic realities of parents and kids and teachers in the United States. Because if you go outside of Washington and you talk with people about these issues, they understand how huge a deal this really is for them. And I would tell you, whether it's on stuff like the right to opt in, as opposed to just automatically an opt-out situation, which the current structure is, or the idea that we need to act as a country and a global world, because this obviously (inaudible) the same, and Julie said that in her remarks, too. You know, what we do here is going to have global implications.

I just think it's time we have a -- I think this discussion is very good, but I think we need new legislation. And it will not be exactly the legislation that Ed Markey and Joe Barton introduced on May 13th, but I believe that there should be new legislation. And I think it should try to balance the best interests of kids in particular.

And when I say "kids," I mean kids and teens, with the important industry imperatives that are involved and the economic issues that are involved, which we completely respect. And I think this is an incredibly important debate.

And the truth is, though, it's been stalled out for a long time. This is the first time we've had a really serious discussion. And by the way, this is our job at Common Sense. Look, you have people being paid hundreds of millions of dollars within a two-mile radius of this building advocating on behalf of the large tech companies, who we work very closely with. There is nobody advocating on behalf of kids and parents and families out there to anywhere near the same degree.

So I think these are incredibly important issues. By the way, one thing that wasn't mentioned, by privacy advocates or industry folks, that I think is very important to understand as well is education. You brought up the idea of college. Come on, we'll figure that out. It was a legitimate point, Chris, but obviously you're going to have to carve out exceptions and be mindful of that.

WOLF (?): But you know, a bill almost got passed in the state of Maine that would have cut off communications between colleges and students. So we may figure it out, but legislators are introducing proposed laws that, if they go into effect, really would stop speech. And so...

(CROSSTALK)

STEYER (?): I don't disagree. I agree with you. Fair. I agree. And there will be overreaching bills, like COPA was, and there will be people who take it too far.

SWIRE (?): I was at the EGA (ph) forum a couple of weeks ago in France talking about our First Amendment. And I had a French official kind of yell at me, "Stop hiding behind your First Amendment," as if I could control it or as if I have any problem with our First Amendment.

And look, we have to understand that it exists. It will, we hope, continue, and Congress just can't continue to pass laws like the CDA and COPA and the do-not-track-kids law, oblivious to the First Amendment issues.

So we really have to think about what role technology and education can play. And the example that I like to give that's current is the issue of cyber-bullying. Attention was focused because of some really tragic episodes -- someone impersonating a 13-year-old, which drove a little girl to commit suicide; the videoing of a student who then committed suicide.

You now see really concerted efforts at sending the message of tolerance and civility online in middle schools and in colleges, and the online services doing more; states passing laws that don't restrict speech, but promote education and promote cooperation in dealing with the issue.

(CROSSTALK)

STEYER (?): Just to respond, two things I'd say. First of all, look, I teach First Amendment law at Stanford, and I'm a very progressive person. So I agree with that as strongly as you do. But I'll tell you, cyber-bullying is a huge deal. And by the way, it's not solved just because there is a little attention right now.

(CROSSTALK)

SWIRE (?): I'm not saying it's solved, but it's certainly getting a lot more attention and there is some progress being made.

STEYER (?): There is, but these issues go to the front -- look, I have four kids, and anybody out there who's a parent knows this issue. These issues are not going to go away, and the privacy issues are enormously important.

(CROSSTALK)

TANDEN: OK. OK. Let me (inaudible) that conversation.

(CROSSTALK)

STEYER (?): The only reason I say that is these debates will happen, but they need to happen with a really strong understanding of the best interests of kids and parents. And in general they don't. And I agree that there are overreaching laws, but there are going to be voices in there that respect the First Amendment and that try to find solutions. And that's why this kind of panel is good...

(CROSSTALK)

SWIRE (?): All I'm trying to say is you really doom the discussion when you introduce a law that is so broad and has so many First Amendment problems that people don't take it seriously.

TANDEN: Just on that issue of bringing attention to issues, and the role of cyber-bullying and how having a spotlight on it is really helpful: I wonder, Ed, if you can -- and we'll move to questions from the audience soon -- actually address the issue that a lot of this world is not transparent to consumers. I don't have any sense of how my information is being sold to people, and I don't have any sense of what the value of that information is.

And if there were mechanisms by which that information could be apparent to me, then ideas like personal empowerment and empowerment over data would be more meaningful, because I would be a consumer with full knowledge. And I think the challenge of this theory is that some areas are like a Wild, Wild West and other areas have a fair amount of regulation.

So if you could speak to the transparency issue, and then I think we'd like to take questions from the audience and make sure everyone gets a chance to respond as well.

FELTEN: Sure. This I think is a hugely important issue. In some sectors, consumers have a pretty good idea of what information is being gathered and how it's being used. But a lot of the concern for consumers is really driven by the idea that they can't be entirely sure what information is being gathered, and especially can't be sure how it might be used down the road.

There's, I think, a real concern that some companies are just gathering all the information they can and then trying to find ways to make money off it later. To the extent that there is an understanding with consumers about what is going to happen and what the trade-off is in giving up the information, then, setting young kids aside, you can get to an understanding. Consumers may be more comfortable with what's happening, or they may have a market discussion with sites and advertisers about what's OK and what's not, what they'll accept. And I think one of the arguments in favor of do-not-track is that, if done right, it can facilitate that conversation, so that sites will have the opportunity to talk to consumers and say, "Hey, we would like you to agree to allow this kind of collection and that kind of use of the data, and turn that on, and here is the benefit we will give you in exchange."

And I think the closer we can get to a situation where that understanding is explicit and in the open, the better we'll be.
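For context, the do-not-track signal that this conversation would be built on is, in the browser implementations of the time, a simple HTTP request header, `DNT: 1`, sent with each request when the user has opted out. A minimal sketch of how a site might check for it follows; the function name is an illustration, not a standard API.

```python
def tracking_allowed(headers):
    """Return False when the request carries the Do Not Track signal.

    'headers' is a mapping of HTTP request header names to values.
    The DNT header is set to "1" when the user has opted out of
    tracking; lookup is case-insensitive, per HTTP convention.
    """
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("dnt") != "1"
```

Note that the header only expresses the user's preference; whether the site honors it, or responds by offering the kind of explicit exchange Felten describes, is up to the site.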

(CROSSTALK)

TANDEN: Great. So let's open it up for questions. Oh, look, there are so many.

If you could identify yourself again, that would be great.

QUESTION: Coming from Europe, I would rather speak of citizens' rights than of consumer rights. And I have a question that sort of leads to the blurring of technology and legal questions. Laws are bound to nations unless they are contracts, international contracts. Technology is spanning the whole world.

Would you maybe elaborate a little bit on how far you think technology makes it impossible to legally govern these issues in an international sort of (inaudible) way?

(CROSSTALK)

STEYER (?): It sounds like a law professor question to me.

(LAUGHTER)

TANDEN: Yes, Peter.

(CROSSTALK)

SWIRE (?): I think that the world-wide nature of the Internet and the differences of law do make it challenging to get to absolutely 100 percent compliance with any kind of a law-based system. But that doesn't mean that it's not worth trying and that we can't provide a lot of benefits to people in -- at least in countries that do have laws in an area.

In practice, the largest and most prominent and respected companies in the industry are likely to go along with the laws that apply to them in the major countries where they do business. And so I do think that there is a lot of leverage that law can get, although it's not going to get you to 100 percent.

And you do need to think about what happens at that last margin. But that's really true of any law. You -- it's rare to get 100 percent compliance with any law.

(CROSSTALK)

FELTEN (?): I think one of the ways this comes up most often, having taught the law of cyberspace for a long time, is around the First Amendment. There is a sort of slogan that the First Amendment is only a local ordinance on the Internet. Right? But on the other hand, that locality is where a lot of things happen, which is the United States. And so there has been continuing frustration from Europe, from Asia, from a lot of places about things that are legal in the United States.

In France, there was a big case involving Yahoo. In France it is against the law to sell Nazi memorabilia, but somebody in France could just go on the Yahoo site or other sites in the United States and buy the same things there. There was a big legal battle about it, and the upshot was that Yahoo had to comply in France, but the judgment couldn't be enforced to stop the continuation of the sale in the United States.

And I think for data collection, for a lot of other things, for defamation claims and a lot of other things, the First Amendment turns out to be a you-can-do-it-here rule on the Internet for a lot of things in the United States, and that is a continuing source of frustration, as Chris said, for a lot of the rest of the world.

SWIRE (?): But you know, I want to mention something that Peter Hustinx, who is the E.U. data protection supervisor, observed when he saw the FTC report and the Commerce Department report, and looked at what the E.U. is proposing in terms of changing the framework. He saw a lot of commonality and a lot of convergence and harmonization.

And so my hope is that we stop playing this game of "my framework is better than yours," and really examine where there is common ground internationally. European Justice Commissioner Viviane Reding has said she hopes a legislative proposal will emerge that attaches E.U. data protections to every E.U. citizen's data no matter where it goes anywhere in the world. And that's, I think, just setting up a fight that is unnecessary in light of the fact that we really are cooperating more than ever before.

TANDEN: Additional questions? Over there on the aisle.

QUESTION: This might be a familiar question, but I'm going to bring it up again. We saw, for example, the Netherlands moving to requiring opt-in for third-party cookies, and I was actually at the Brussels conference talking about this. And let's say for a second that with do-not-track we get to an effective regime where sites are now honoring do-not-track, and we have a good amount of coverage.

The question that I've asked -- which I don't know the answer to either, but I think it's an interesting one -- is this: since there will be no requirements or restrictions that I know of on predicating access to content, what do we do in the case where someone wants to see a (inaudible) video or a funny Adam Sandler clip, and they can only see it if they (inaudible) or allow the site to track them? Given that people are very bad at managing risk, and hyperbolic discounting and all these kinds of things, do we actually make things worse? Right? Could sites actually follow with more invasive tracking, since now there is affirmative consent (inaudible)? I would love to get some thoughts.

SWIRE (?): There is a provision in the Markey-Barton bill, the do-not-track-kids bill, that prohibits conditioning the service on letting them continue to track you. So they may do the things they need for you to log in that day, but they don't get to keep tracking you over time, according to that bill.

And that sort of purpose limit, use limit -- you're only here for today, but we're not going to track you and we can't keep you out -- is something we have not seen in U.S. law; something that has been proposed a lot of times, and it makes companies very nervous.

FELTEN (?): Isn't that basically saying to a site, though, that you have to give your content away for free to somebody who exercises this right, when you really want to sell it or get value for it?

TANDEN: At least people should know that it's monetized that way.

SWIRE (?): But, for instance, in HIPAA there is that rule: the doctor cannot condition your medical treatment on you giving unlimited rights to the information. So we've made that decision in the medical setting. The doc doesn't get to say that. The hospital doesn't get to say that. The Internet folks say we can't have free content unless we condition it, and that's where I think the debate is.

STEYER (?): You know, the doctor -- the service is not dependent or arguably not dependent on getting that kind of permission. Whereas, the site will say...

(CROSSTALK)

TANDEN: Ed, did you want to get in there?

FELTEN: I guess I don't have much to say. This would be a good point to mention that I am not a spokesman for the FTC.

(LAUGHTER)

Having said that, I think where most of the debate has been on this issue is that companies would have an opportunity to make that proposition to the user. That is, we're not going to offer you this unless you agree to make an exception for us, and then leave it in the hands of the user. And I think most people are comfortable with that, again setting aside the issue of kids.

TANDEN: Over here.

QUESTION: I want to talk about the piece of the debate that Peter talked about a lot. And that is about parenting, about the parents, especially the mothers, that actually disconnect their children completely from all things online because of these types of things that are going on.

It stretches the concept of fear, but it is really about parents who want to protect their children in every possible way that they can, so they do their best to disconnect their children.

So, how do we address that issue in this debate?

STEYER (?): I think it's actually a good question. And I would argue that if we build in much better privacy protections for kids and a much broader public education regime, both, that that in fact will be very good for business in the long run. And I would point to businesses like eBay who have been built on the idea that you can protect your information and keep it private.

I actually think -- look, I work for kids, not for industry. So I'm cynical when I hear -- and I know the guys in the large -- I get cynical about the idea that they won't figure out how to make business work if certain privacy regimes and stuff are done, because I figure they will. And it's a trillion-dollar industry and they will figure out how to keep making trillions of dollars, and we will be the leaders in innovation in society because we will.

But that said, I think actually that by putting in proper new privacy regulations and a broader public education effort, too -- and simpler, easier-to-understand, transparent privacy policies -- that's actually going to be very good. Because I think a lot of people who right now will just pull out entirely because they're afraid will say, "OK, there is" -- and that's what doesn't exist now, but I think it can get there. And it will be a negotiation among the various parties involved.

My big concern is whether kids and families will be represented at the end of the day. But I actually think in the long run, it will help industry if we have really clear, understandable laws. They'll get argued. It will be complicated. We'll have to consider the First Amendment. But I think in the long run, it's a great question.

SWIRE (?): No, I think we're happily at a point now -- with The Wall Street Journal series that ASCAN (ph) was involved with, with all the bills that have been introduced, and look at the size of the crowd here today -- that businesses are finally getting it that privacy is good for business.

And Ed described the efforts going on. And so for the parent who tries to disconnect their kid: first of all, I think they are being naive if they believe that by disconnecting the computer at home they've cut their kid off from the Internet, because there are lots of places for them to get it. So one hopes that businesses, recognizing that privacy is good for business, will build in the kinds of protections we're talking about.

TANDEN: Question over there.

(CROSSTALK)

QUESTION: I just have a simple question. Is the erase button a concept and a process or is it technologically possible?

STEYER: To me, it's a concept that should happen. I am probably one of the least-qualified people in this room to tell you how it would happen technologically. But I believe that the industry will figure this out. That's another one where I believe -- that's our job at Common Sense. I believe that should happen.

I believe as a parent, and even for me as an individual, I would like that to happen. But to me, it's a concept, and the actual technological development of it is up to the extraordinarily innovative and talented technology industry of the United States, or the world, to figure out.

TANDEN: Ed, do you want to respond? And then Chris, if you want to say anything.

FELTEN: Sure, yes. Well, let me speak a little bit to the technical feasibility question. And whether it's feasible really depends on what the button is supposed to do. If what the button is supposed to do is to take the information that I uploaded into my account and make sure that it's gone when I delete my account, that's something that companies are able to do.

But if you try to make a button that takes down information about me no matter where it is on the Internet or who put it there, it's a head-scratcher how you would actually make that work. And somewhere in between, there's a line between feasible and infeasible.
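Felten's distinction between the feasible and infeasible cases can be sketched in code. The toy `UserStore` class below is purely illustrative (none of these names come from the panel): when a single service controls the store, honoring an erase button is a straightforward delete; data that has been copied to other sites is simply out of reach of any call the service can make.

```python
class UserStore:
    """Toy in-memory store mapping an account to the items it uploaded."""

    def __init__(self):
        self._data = {}

    def upload(self, account, item):
        # Record an item under the uploading account.
        self._data.setdefault(account, []).append(item)

    def items(self, account):
        # Return a copy of the account's items (empty if none).
        return list(self._data.get(account, []))

    def erase(self, account):
        # The feasible case: the service controls this store,
        # so "erase" can simply drop everything tied to the account.
        self._data.pop(account, None)


store = UserStore()
store.upload("alice", "photo.jpg")
store.erase("alice")
print(store.items("alice"))  # → []
```

The infeasible case is exactly what this sketch cannot express: once `photo.jpg` has been copied into someone else's store, no method on this service's `UserStore` can reach it.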

WOLF (?): And then I am concerned about if you extend that right, if that button is interpreted by some to be a lot broader than it really should be, you get lawsuits like the one in Spain against Google for not removing a link to an unfavorable article about somebody. The person didn't go to the site hosting the article. They went to Google and said, "remove this link." Google didn't and they got sued.

Or in Italy, where there was an auto-complete feature: you put in Ed Felten's name and then it appended whatever was most logically associated with you. There was an objection to that, and there was a lawsuit over that.

So the, you know, the right-to-be-forgotten concept as it plays out in Europe can be really instructive when we talk about the eraser button here.

TANDEN: I think we have time for one more question. Over there -- and I know many of our panelists will stay behind to answer some questions.

QUESTION: Thank you. I work in the area of adolescent health and reproductive health and rights. And I would like to point out that although it's very, very seldom that a minor's interests and her parents' interests diverge, my area is one where this happens, because the U.S. Supreme Court has given girls under 18 a right in the 37 states that have laws saying they must inform their parents, or get consent from their parents, if they want an abortion.

The Supreme Court has given these kids the right, if they choose, to go to a judge and have the matter be private. And, you know, I know it feels like such a sliver of the really important and wonderful things that you all have been talking about, but from my point of view, I don't understand what is going to happen to those kids who get their information -- to the extent they get it, and it's very difficult for them to get it -- from the Internet about where they can go to talk with a judge, where they can go for an abortion. How are they going to get that information under, for instance, Common Sense's view of what they want?

TANDEN: I think that does raise a broad question, in a particular (inaudible) but also in a broader sense: you know, what happens when parents' and children's and teens' interests are not one and the same? And that is obvious in one context, but there are broader arenas dealing with child abuse and parents who are abusing their children, where the children have their own needs for information.

So, I don't know. Jim if you've...

(CROSSTALK)

TANDEN: ... struggled with that.

STEYER: No, it's a really important question. I mean, I agree. I was thinking of the child abuse area, too. You're definitely going to have areas like that. And I can't sit up here and tell you exactly how you're going to solve it today, but it is a very important issue and I respect it. I would respect it in the context of child abuse or other types of situations where you're going to have to carve out exceptions.

At the end of the day, you know, this is going to end up being a negotiated process. I'm just glad to see that it's finally being negotiated and discussed, because it took an awful lot of time. And I still would say, you know, we haven't had this discussion over a period of years, and we need to have it. And we'll find some responsible actions, probably by the FTC. We'll hopefully come up with some kind of legislation that everybody will grudgingly agree was probably necessary.

And we'll have a massive amount of public education that needs to happen. Because all of our lives are being changed by this and we all need to be much more aware of what's going on.

(CROSSTALK)

SWIRE (?): Can I have one sentence on that?

TANDEN: Yes.

SWIRE (?): So one reason that the laws are different for under-thirteens and thirteen-to-eighteens is that we don't really think eight-year-olds have that same autonomy, but lots of 16-year-olds are exploring things about their sexuality, their religion, whatever it is, that their parents shouldn't have to give prior consent to, in my view.

And so we've had these two different categories of law. We probably will in the next round, if I were to guess.

TANDEN: Great. And with that, thank you all very much.

(APPLAUSE)

END
