
Technology- Better Life or Bigger Risk?

An Interview with Adam Kovacevich

Editor’s Note: We hope you enjoy the video above. If you’d rather just listen to the podcast, click the button below to listen on Apple Podcasts: The Common Bridge. It is also available on all other podcast platforms. We have included the transcript of this program below. We offer this program in its entirety to our paid subscribers, and welcome all to subscribe below.

Listen to Podcast

Richard Helppie

Hello, and welcome to The Common Bridge. We have a great topic today that affects everybody, whether you know it or not, that's the role of technology in our lives. The implications of technology as a means of wielding power or as a means of informing, what's the role of information, misinformation and disinformation. Today, our guest from a trade association known as the Chamber of Progress, Mr. Adam Kovacevich. Adam, welcome to the show.

Adam Kovacevich

Thanks for having me.

Richard Helppie

We're going to talk a little bit about law and society and how oftentimes technology runs ahead of the law. We're trying to figure out how to codify and maintain our constitutional republic form of democracy while we embrace these new technologies, hopefully, to make a better world. Adam, our audience likes to know a little bit about our guests so, if you don't mind, tell us a little bit about where your early days were and a little bit about your professional arc and what you're up to today.

Adam Kovacevich

Sure. I grew up in Bakersfield, California; my dad's a farmer, but I did not get the farming genes. I came east for school and eventually ended up working on Capitol Hill for my local congressman; that's where I got the political bug. He was one of the founders of what were called the New Democrats, the late-90s Clinton/Gore-era moderate Democrats. One of the things they were really interested in was technology and promoting the US tech industry. That opened my eyes to the tech industry, which was nascent at that time, and to its issues. Since then, I've really spent my career at the intersection of politics and the technology industry. I ended up working a couple of years on Capitol Hill, then went to a small company called Google when it was setting up its Washington, DC office. I was the seventh or eighth employee in that office, and ended up staying there for about 12 years working on all of the public policy issues affecting the company. I then went to a company called Lime, one of the companies that provides the shared electric scooters you see in many cities, and that was a lot of fun. But I was interested in starting this group because, frankly, Democratic policymakers in particular used to be pretty excited about the technology industry, and in the last couple of years some have taken a more negative turn. I don't believe that's totally consistent with where voters' opinions are, and I was interested in doing something about it. So it's a new tech policy industry group.

Richard Helppie

What's the mission of the Chamber of Progress?

Adam Kovacevich

One of the things I always liked about working in the technology industry is the way that a lot of technology services help advance what I consider to be progressive societal goals. So for example, when I worked at Google, one of the big controversies I worked on was the Google Book Search project, because that project was scanning millions of books in libraries and it was taking, I would say, an aggressive but grounded approach to copyright policy. But when you take a step back from that project, its goal was to make university library contents available to anybody. It used to be that you'd have to live near a university library to have access to those books. Projects like that have always gotten me excited about the potential of technology: the way it has gotten rid of information middlemen and driven costs lower for consumers. All of that's really exciting. There's also just a cool factor, like the way you think about the drone that might deliver a package to your house someday. So I've always been excited about that potential of technology, but I also think we need to be honest about the fact that technology has downsides. It has negative impacts that need to be regulated and mitigated. We should have smart rules and regulations in place to prevent that, and we should also make sure we have rules in place so that companies - particularly the biggest ones - operate responsibly and in the interest of consumers. Some traditional business groups might say no rules at all, a kind of untrammeled free market, whereas we take a little bit different approach.

Richard Helppie

I've been in technology businesses for a long time, and in fact pioneered some of the things that came to be known as groupware, hoteling, and telecommuting - obviously, ancient history from those beginnings - but my observation was that we hadn't socialized those new technologies yet. Nobody knew how to behave politely. I'm a perpetual optimist; I think we're getting there. But I can imagine, if you're a member of Congress, all of a sudden people are doing things differently and you're asking, what do we need to do about it? Maybe it's nothing, as you said, unfettered free enterprise, or maybe it's something like it needs to be centralized. I think there's a lot of room between those poles. Where can our listeners, readers, and viewers find out more about your organization?

Adam Kovacevich

On the internet: progresschamber.org is our website, and we have links to our Twitter feed and videos there. It's a really good collection of all of our work.

Richard Helppie

And who funds Chamber of Progress?

Adam Kovacevich

We're a business trade association, so we're funded completely by companies. We have about 32 companies among our funders, from big companies like Apple, Google, Meta, and Amazon, to innovators like Cruise and Waymo, the autonomous vehicle companies; companies like Instacart and DoorDash involved in delivery; FinTech companies like Chime; and, increasingly, some of the innovators in the cryptocurrency space as well. We like to be transparent about who our funders are, and we're 100% supported by companies.

Richard Helppie

How would those sponsors define success for Chamber of Progress?

Adam Kovacevich

The way I like to think about it is that technology as an industry is not yet being regulated maturely. By which I mean, technology had a long political honeymoon, and then, probably around the time 2016 happened, a lot of the technology industry started leaving its dirty laundry on the floor and a lot of policymakers started threatening a divorce. I think we're still in the phase where a lot of the ideas that you see from policymakers - the bills and regulations they propose - are what I consider somewhat extreme reactions to what they're seeing. I think we will get to the phase of technology regulation where we regulate modestly to address a true consumer pain point. For example, I remember 2008-2009, when there were a bunch of instances of people getting on commercial airplanes and sitting on the tarmac for, like, five hours; they couldn't go to the bathroom, they weren't given snacks - you probably remember this. That summer there was a whole consumer outcry to Congress, and Congress directed the FAA to do something about it. They came up with what became known as the three-hour rule, which says you can't keep people sitting on the tarmac for more than three hours without giving them notification, bathroom access, snacks, etc. It worked. This was an amazing example of a regulation that worked, in the sense that it almost eliminated this problem overnight. They didn't break up Delta Airlines, they didn't tax United Airlines. They applied a rule that made sense across the board to all airlines, big and small, to respond to a consumer pain point. I don't see a lot of ideas like that right now in tech policy, ideas responding to a genuine consumer outcry. Typically the ideas being pushed are pushed for some other reason, a lot of times by the other industries that have been disrupted by technology.

Richard Helppie

That's a great example, in my humble opinion. And the challenge with technology is it's moving so fast [that] the regulators are trying to hit a moving target. If we can go back to the breakup of AT&T, where the judge decided it was better to break the company up into the "baby Bells" rather than keep one behemoth. IBM was restricted, and in the 90s there was antitrust action against Microsoft. But today things are moving so much faster. How do you regulate something when people really don't know how to use it yet? We really don't understand it, and by the time you find out what the problem is, it's just way too late. Are you running into anything like that?

Adam Kovacevich

Oh, absolutely. I think you see this all the time. One of the most notable examples right now is that - let me get my timing right - about ten years ago, Facebook acquired Instagram, and then eight years ago, it acquired WhatsApp. Both of those were small companies - they were growing, they were successful - but they were early in their lifecycle, and Facebook recognized an opportunity there, and I think that made both services even more successful. A couple of years ago, when there started to be more anxiety about Facebook's power as a company, policymakers and regulators, at the Federal Trade Commission in particular, launched a lawsuit against Facebook saying that those acquisitions of WhatsApp and Instagram, eight and ten years ago respectively, were illegal, should not have been allowed in the first place, and should be unwound. In some ways, they're penalizing Facebook for having successfully integrated those companies into Facebook. But if you just take a look at the broader competitive environment in social networking, TikTok is increasingly eating Facebook's lunch; they are on the growth trajectory. They're the hot app for young people in particular, and yet all these government resources are going to fighting the last war. I think it does take regulatory humility to say, look, we are concerned about power, but let's see where there is true concentration of power. Social media, for example, I just think is so fickle, because every new generation of young people wants its own space, so it's really hard for a social networking service to hold on to its pole position culturally. I think every day that goes on, this Federal Trade Commission suit against Facebook, now known as Meta, looks more and more outdated because of the rise of TikTok. So I think that's a good example.

Richard Helppie

That's a fantastic example and not without historical parallels. You think about television programming, my parents - since deceased - [would say] what time is something on? Versus [cross talk]

Adam Kovacevich

I don't think we'll ever go back to that - no, we'll never go back to that.

Richard Helppie

My generation, it's time shifting and now my children, it's like, TV, what do you care...we go get what we want, when we want it. Streaming...

Adam Kovacevich

And I mean, it's funny, because we still have rules in place at the Federal Communications Commission about how many newspapers you can own in a local TV market, and how many broadcast stations you can own. I understand that; it's based in part on those services being broadcast over the air using government-granted spectrum, so government has a stake in that. But you look at policies like that and you think, well, it's not hard for people to find a diversity of views anymore.

Richard Helppie

One thing I'm curious about, why do you identify as a center left organization?

Adam Kovacevich

A great question. It's a little unusual, admittedly, but part of the reason I was interested in doing that was because I've spent a lot of my career in Democratic politics, and I had the experience of seeing Democrats be excited about the technology industry for a long time. Then, really after Trump got elected, many elected Democratic policymakers started taking a more negative turn. In my view, most Democratic voters don't share that negative turn. But one of the things I think is true about politics today - and in some ways this is unfortunate, but it's also true - is that both parties are having an internal debate about how to handle these questions. Just as within the Republican Party there are traditional free-market voices versus a more MAGA, nationalistic voice, I think that's also true within the Democratic Party when it comes to debates about technology and innovation. There's one strain of thought in the Democratic Party that tends to be naturally suspicious of power and holds the view that technology services must be predatory or hurting people in the background without their knowledge. Then there are other views that say, frankly, technology services make my life much easier and more convenient, and I'm excited about the opportunities they bring. So what I see now in the political landscape is that it's really important to be part of that intra-party debate. I also, frankly, saw that other organizations like mine were solely focused on Republicans. So admittedly, it's a choice that reflects the political environment today, for good or for ill.

Richard Helppie

Good for you for being right upfront about where you stand - and it would probably be even better to find a center-right person and have you both on at the same time (Adam Kovacevich: Yeah, absolutely.) for a conversation, because these are universal concerns. One of the things you talk about on your website is digital opportunity for all; I can't overemphasize the need for that. Again, historical precedents: rural electrification, universal phone service - when we had landlines, there was always a base level of service so that everyone had a telephone and wouldn't be cut off. And look what happened during the pandemic when schools were closed. Some wealthier districts said, go ahead and work from home, no problem; kids could go into a private bedroom, pop open their computer, and download Zoom. Then you've got places I'm very familiar with where there were no devices, no place to work, and no WiFi in the home. I believe we need to open up that digital connectivity so that we get everybody in; we can't have the technology haves and the technology have-nots.

Adam Kovacevich

Absolutely. When you hear these stories about people parking their car in the public library parking lot just to be able to pick up a portion of the library's WiFi signal so that they can apply for jobs or their kids can go to school, it kind of breaks your heart. It does point out a need for significant investments in broadband access. The infrastructure bill that was passed last year in Congress definitely does that, and you've seen states increasingly doing that. We're big supporters of that, and I'm very optimistic. It's interesting: when I worked at Google, policymakers from other parts of the world and other parts of the country would come to the Google headquarters in Silicon Valley and say, how do we create our own Silicon Valley in Ohio or Tennessee or wherever the case may be? There's always this pursuit of good jobs, better jobs, and the old playbook for that was: let's attract a company to bring its regional headquarters or headquarters or warehouse here, and that will draw workers, which will lead to more tax revenue and better schools and so forth. With the remote work revolution, increasingly, I think any city could become a tech city, an attractor of high-tech jobs, but they're competing on something different: quality of life. So I think that has been and will be good for diversifying the gains of technology to more parts of the country, so that more parts of the country can lay claim to being beneficiaries of the tech economy.

Richard Helppie

I think your first example, about getting university libraries online, is clearly a democratization of that information, and it's generally fairly high quality; at least there are citations, you can see who did the writing and how they did the research, and make your own judgments. Those are some of the benefits, and of course there are many more. But what about the risks we see: things like the ability to track people, and we've actually seen the ability to make a non-person out of someone - coordinated de-platforming, the ability to track and stop your car. We were reading recently that if you're near the border with Canada coming in, the Department of Homeland Security is alleged - and I don't know this for a fact - but alleged to be able to download everything off your cell phone and keep it for ten years. What about the risks of that digital world we're running into? Is this something you deal with, or are you more about the content and information side?

Adam Kovacevich

No, I think what you just described is a consistent theme in technology policy and regulation debates, which is to say that every awesome tool we use can also be someone else's weapon. All the things you described - and, to criticize the technology industry for a bit, I think in the past there was sometimes a naivete about the ways a positive service or tool would be abused. People hadn't necessarily thought through all the ways it would be abused. But increasingly, I do believe companies are getting smarter about this. For example, when I worked at Lime, one of the shared scooter companies - and I think shared scooters are an amazing service for getting around cities more efficiently - there were issues around where they should be parked. How do you park the scooters to make sure they're not blocking, for example, a handicap-access curb cut or the sidewalk? Then how do you police that behavior? It's a great thing that these scooters are dockless - that makes them more accessible - but how do you encourage responsible parking? My general feeling is that, unfortunately, you see some policymakers adopt the view that because of the negatives, we should stand on top of the whole thing and hit the pause button. For example, some of the union organizations are very opposed to autonomous vehicles because they're concerned about a negative impact on driver jobs - which I'm very sympathetic to - but to the point that their official position is there should be no autonomous vehicles without a human driver. I just don't think that's a realistic or credible position.
It would be much smarter, for example, to take an area like that and say, look, there's going to be a big transition: there are going to be driving jobs that are affected and new types of jobs that are created, so what can the government do to help train today's drivers to fill those new kinds of jobs? That's the responsible thing to do. I think people might feel a little less anxious about it if they saw government policy stepping in to help in that situation. I do think ideal regulation in any situation preserves the beneficial use of the service while mitigating the 5%-10% of cases that are abuse, and tries to get that number as low as possible, lower than 5% hopefully.

Richard Helppie

I think this is a universal truth of the human experience. If you go back to the early 1900s, 50% of the United States workforce was involved in agriculture, and now it's 2%. So from the view when mechanization was coming in, it would seem that 48% of the people were going to be out of work, and clearly that didn't happen. When you think about automobiles - and I'm a proud native Detroiter, alright, so I've seen a lot of this - what we really did was take the horse-and-buggy model: the stable behind your house is now a garage behind your house; you had a hitching post, now you've got a parking space. You owned the horse and fed the horse; now you own the car, you maintain, insure, and store the car. We're going to a new place. I've spoken to a number of people in the automobile industry [about] segments of the population - folks with disabilities, movement restrictions, and the like - being able to call up a vehicle when they need it, versus having to maintain, store, and insure one. It's going to be a much better transportation experience, and another role for big tech: Waze or Google Maps right now tell you where the traffic jams are; soon the cars will know where the other cars are. So I think we're going to a new place, and there will be disruption. I don't think anybody can really say what those new jobs are going to be, but people will be out there doing something. (Adam Kovacevich: Yep. That's right.) So, the political pressure for a party in power to use or misuse some of the information: Facebook/Meta can be pressured to throttle a story, and we saw that happen in the last presidential election cycle. We had a pandemic, and there was some good information put out, but also a number of impeccably qualified people were kicked off Twitter, and they ultimately ended up being right. So how do we rein in the instincts of those in power from wielding that power in an unhealthy way through the technology?

Adam Kovacevich

Look, I think this question of how the big tech platforms engage in content moderation - using their editorial judgment - is probably one of the central issues in the tech policy debate right now, and it's largely influenced by the partisan divide, particularly around speech. It's interesting: a fair amount of what policymakers do, when it comes to the tech platforms now, is what some people call jawboning, which is using their bully pulpit to try to push Facebook or YouTube or Twitter to take down or not take down certain types of content. They're appealing to, and putting pressure on, the companies themselves. And this question of how the services engage in content moderation has just become harder and harder. There are whole departments within each of the companies, typically called Trust and Safety. What I think people don't totally understand is [that] all these services remove content that is clearly illegal. Child pornography is clearly illegal, and services work actively to remove it through a combination of human review and algorithms. But then there's a whole category of speech that is not illegal but which raises questions. All the platforms started from a freedom-of-expression baseline; they want to allow as much speech on their platform as possible, from varying views. But then you get into situations that challenge that norm. Think, for example, of the Tide Pod challenge, where people were swallowing Tide Pods. It's free speech, but it's also dangerous. Or think about somebody doxing somebody: if I had a dispute with you and I went onto YouTube and posted your home address, and sent a bunch of fake packages to your house or whatever, that's legal speech.
But most of the platforms have adopted a whole range of policies to deal with these situations where the speech is lawful but awful - and awful, of course, is in the eye of the beholder. One person's awful might be another person's perfectly acceptable, and there's no pleasing everyone. But I will say, the last couple of years have really tested that, in the sense of Trump, COVID, and January sixth. I mean, never before have we seen an insurrection at the Capitol. So if you're Twitter or Facebook and you have this norm - okay, we let people use our services to express themselves - then you have to ask yourself: are we comfortable with people using our service to broadcast their ransacking of the Capitol building? I think not all of them are. Or comfortable with letting Trump on. And I know it's hard, because when people see Twitter kicking off Trump, it tends to reinforce - in many Republicans' and conservatives' eyes - this sense of big tech, coastal elites, ganging up to suppress their views. But these are private services; they can allow whatever they want, and for them to make the choice that, okay, we draw the line at people using our platform for insurrection - I think that's a reasonable choice for them to make. Other services might make the choice differently, but this is why content moderation is one of the most contentious areas right now.

Richard Helppie

It is, and one thing that did trouble me was when Jen Psaki was the press secretary for President Biden and just came out and said, yeah, we send things over to Meta and tell them that we think this is a problem. That's where I get concerned: people in power. And as for a former president [with his] 3 am tweets while he's sitting on the throne, sending out stuff, I was like, yeah, here it comes again. Part of me says, go ahead and do it; show people who you are and let people decide.

Adam Kovacevich

Let people decide, yeah.

Richard Helppie

Because I think a lot of what we've seen going on is really making him something of a mythical creature. You get into "the enemy of my friend is my enemy," and I think it's just driving the divide, unintentionally. I'm more the kind of guy who says, hey, just put it out there and let people be seen. And then, of course, Sam Harris admitted to a conspiracy to withhold a factual story about things that would be potentially damaging to President Biden vis-à-vis bribes and things through his son. After the election [he said], yeah, well, it was true. I'm not sure if that's really more of a technology question or a freedom-of-the-press question, but it's out there. Could they have affected it as well, absent the tech platforms?

Adam Kovacevich

I think that's why these are hard questions. For example, there's a little bit of a tendency to be informed by the last controversy. After 2016, there were a lot of efforts by the platforms to say, we're going to work extra hard to make sure the Russians can't infiltrate our systems like they did before - but maybe that makes them a little bit blind to the new threat. So I do think these are hard questions. Frankly, some of the platforms have said that in the early days of COVID they made certain decisions about blocking or labeling tweets or posts that they wouldn't make today, with the benefit of hindsight. That's what's often hard about content moderation: the services know how to deal with the established issue, but the novel issue is sometimes challenging. By the companies' own admission, sometimes they get the balance wrong. New questions come up in this area of content moderation all the time, and I think that's what makes it challenging.

Richard Helppie

Indeed, it's the weaponization of the technology that I wrote about in my most recent column on narrative versus news. So many of our consumers of news look at an article and skip over the fact that the source is "people close to the matter" or "a confidential person with subject matter knowledge." It's like, tell us what you're talking about and let people make that decision. There also needs to be a counter-narrative so that people can sort it out. But we've now divided into these camps of affirmation programming on the right and affirmation programming on the left, and the algorithms are going to keep feeding stories based on what a person likes. I hope to baffle them, because I read everything. But I will tell you something, Adam, this is not new. In my first job in computer systems, I worked for a company - I didn't know it at the time - that had the largest private databases in the world, the R.L. Polk Company, doing direct mail. What we could gather about a person was astonishing: we were getting auto registration lists and census tract information and magazine subscriptions. We could build a profile on a person... (Adam Kovacevich: A data broker of its day.) It was a data broker, that's exactly what it was. An automotive company would come in - like Ford Motor introducing a new Mustang - and they'd say they wanted to target people driving a Camaro, in census tracts with income over a certain level, who also subscribed to certain kinds of magazines. And we would write an algorithm to go find those folks. It was data mining, but without nearly the technology power we have today, because back then it was pretty easy to get erased. Today, I wonder about young people recording their every move on social media. They're not going to want that as adults, unlike every adult who grew up without it. It's like, I wasn't there; it didn't happen that way.

Adam Kovacevich

No, I would actually argue that today's teens and tweens have actually internalized that lesson, and that's a big driver of the success of services like Snapchat and Instagram Stories: Snapchat having completely disappearing messages, and Instagram Stories being a little bit less ephemeral, but still somewhat ephemeral too. In your teens, you're experimenting, you're trying on an identity, and you don't really want it to chase you everywhere. So I think that's a big driver of the success of those services.

Richard Helppie

You know, one of the other things I see when I look at risks on the technology platforms - and now there's a situation coming to the fore where we're about to get some movement on it - is the fact that there's been direct extraction from tech platforms by our government. They've gone in to take information out, and James Clapper said, yeah, that's a thing we should be doing. It's like, no, that's not a thing we should do. And you look at the Washington Post's pivot from supporting what Edward Snowden was doing to now saying what Snowden was doing was a really bad thing. What do we do about limiting government's ability to harvest our data and our information?

Adam Kovacevich

Well, in fact, this is something I've also worked on, particularly earlier in my career. There is a whole range of proposals that would require government to go through more hoops - particularly requiring a judicial warrant rather than the lower standard of a subpoena - although most of the platforms now do require law enforcement to come to them with a warrant. But one of the things we do see is [that] law enforcement will go to, say, Google or Facebook and ask for a user's data because that user is the subject of an investigation. A lot of times law enforcement will gag the company from telling the user; they'll come with a nondisclosure order preventing Google or Facebook from telling the user that their data has been requested by the government. This is something the companies would like to see change; they'd like to make it harder for the government to get one of these gag orders so that they can tell the user. That's not to say they're going to deny the government a request that comes to them with a warrant; the companies want to be responsible in cooperating with lawful requests from law enforcement. But then there's, for example, a bill in Congress by Senator Ron Wyden that would prevent any federal law enforcement agency from buying data from a data broker, like the one you used to work at. That hasn't gone anywhere yet, but I do think it's an interesting proposal to deal with this problem as well.

Richard Helppie

The government scooping up additional data that they then can mine. So instead of a specific thing about a particular crime where there's probable cause, they're gathering (Adam Kovacevich: Yeah.) a wide range of information, and I think that has got really serious consequences. (Adam Kovacevich: I agree.) I had Professor Daniel Crane from the University of Michigan Law School on my show early on, because the way that Hitler used the monopolies in Germany - or the duopolies; it wasn't a monopoly - to take control of the country was astonishing. He only had to control two airplane companies, two pharma companies, etc., which he did. And now when you look at the concentration of information in the hands of a few companies, that temptation to take us to a totalitarian state is great for people who think that way.

Adam Kovacevich

So what you're describing, sometimes people talk about what's called a geofence warrant. A geofence warrant might see police in New York City go to Google and say, okay, Google, give us the names of all of the people who were in Times Square when this crime happened. What typically happens - I know this from having worked with one of the companies, Google - is that the company pushes back on that. The companies almost all push back on these extremely broad requests, saying no, no, no, that's overly broad. If you come to us and say, we have reason to suspect John Doe and here's a warrant, we'll give you John Doe's information. But these over-broad geofence data requests are something that the companies resist. Now, there are actually proposals - I know there's one in New York State, and I think, potentially, in California as well - that would restrict the ability of law enforcement to even issue these kinds of geofence data requests. Of course, the typical response from law enforcement is, well, there are cases where it might be valid. A lot of policymakers are loath to cross law enforcement on this question, so these proposals tend not to get very far. But that's definitely something that has been part of the tech policy debate.

Richard Helppie

Indeed, and I know that our military has artificial intelligence and big data systems that they've been using to track terrorists. Sometimes it's as little as metadata; I know that our predictive analytics will say, the person we're tracking is going to be in this village in two weeks' time without an entourage, and that gives us an opportunity for a drone strike. And I'm thinking, if the military has that, the civilian implications are huge. But Adam, is the real big issue that we're dealing with First Amendment issues? I understand that you may be prohibited from saying certain things - as you were talking about, child pornography, things that are clearly reprehensible - but is there any risk about who's going to be allowed to speak? And then the other thing, if you could maybe shed some light on it: does content moderation make that platform company a publisher and subject to defamation or slander or libel?

Adam Kovacevich

Two important things; there's a lot there. One, the First Amendment doesn't protect your right to say what you want anywhere you want to, particularly if it's on a privately run service. So for example, there are a whole number of Supreme Court cases that basically say that the operator of a website or an ISP has their own First Amendment right to decide which kind of speech is allowed on their platform. This is the question at the heart of two laws that were passed by Republicans in Texas and Florida last year that essentially restrict platforms' ability to engage in content moderation. I hope, and I predict, that the companies, the platforms, will ultimately win those cases at the Supreme Court, because the Supreme Court generally has recognized - even from the bake shop case, the gay marriage bakeshop case - that companies have their own First Amendment rights. Whether you disagree with that or not, that's an established Supreme Court precedent; the Hobby Lobby case, for example. I think the Court will eventually uphold the rights of Google, Facebook, Twitter to moderate content as they see fit and will strike down those laws. This question of whether moderating makes them a publisher is really a misunderstanding of the law. People sometimes look at something like Section 230 of the Communications Decency Act and suggest that if platforms engage in content moderation, then they lose their liability protection; they're no longer a dumb pipe. That's not how the law works. Section 230 was meant to incentivize the platforms to actually engage in content moderation by saying that even if you do engage in content moderation, you're still not liable for the user's individual tweet or post. Because if they were, that service wouldn't even exist - if Twitter was liable for every dumb tweet.

Richard Helppie

Let's just say that we agree about what objectionable content is.

Adam Kovacevich

Yeah, which we'll never agree on, by the way. The definition of objectionable is like...

Richard Helppie

Yeah, and I don't want to go down the Alex Jones path because...

Adam Kovacevich

Yeah, we'd never be 100%.

Richard Helppie

We have to assume there are places that we could agree. What...I was trying to pose the question about who can be moderated or censored - whichever word you want to use - versus, well, we're afraid that person might say something, so we're not going to let them on? Is there any...

Adam Kovacevich

I don't think the platforms should ever do that. Well, for example, all of the platforms have rules that if you are affiliated with a foreign terrorist organization on the State Department list, you're not going to get an account. They might deny that person an account from the very beginning.

Richard Helppie

How far a step is that from, you're deemed a MAGA Republican, and therefore, we're not going to give you a spot on our platform?

Adam Kovacevich

I don't see any of the platforms doing that; none of the platforms have done that. I think that would be a huge mistake, both stifling expression and also, frankly, making their services unappealing to a big swath of the US population. I don't think it's in their interest; I haven't seen that. One of the things I think all the platforms have moved towards is more of a strikes system, to say, okay, if you have one post that violates our rules you're going to get a warning. We're not going to kick you off right away. Both Twitter and Facebook do this. I think that makes sense because what they're trying to say is, look, we're going to have standards, but we're going to give you some chance to get in line with those standards. Now, I think increasingly, there are other places you can go: you can go to Parler, you can go to Truth Social, you can go to Rumble if you don't like YouTube. So there are alternative places to go, and the existence of those alternatives, I think, makes it harder for people to say, I don't have anywhere, I can't express my views.

Richard Helppie

I've got a lot of other questions about differences in companies and such, but as we near the end of our time together - and you've been very generous with your time - advice for our audience in any actions that they should take, or is there anything we didn't cover that we should have talked about today? Or any policy perspectives that you'd like to make our audience aware of?

Adam Kovacevich

No, I think the thing I would just leave with is the comment I made earlier, which is I think that we are currently in the stage of tech policy debates where, again, a lot of the debate is about people trying to push their perspective on content moderation, and a lot of that's partisan from both sides. But with respect to how we regulate technology, I'm very optimistic that we can get to this next stage where we focus on encouraging what's good about a particular service or technology and focusing on preventing what's bad about it - after it happens, not stopping the technology in the cradle because of what we're concerned might be happening. I think the same would be true of speech: when you're talking about something like content moderation, let people speak, but be clear about what the rules are and enforce the rules as fairly as possible.

Richard Helppie

Kind of avoiding the tech equivalent of book burning so nobody hears or sees from that author.

Adam Kovacevich

I think that would be a mistake, because one of the things that's great about the internet is the ability to get to all this stuff. That doesn't mean any individual company has an obligation. The New York Times doesn't have to publish my letter to the editor; I'm not being censored by them not publishing my letter to the editor. But the point is, what's great about the internet overall is that you can pretty much find anything you're looking for.

Richard Helppie

Right. It's peer to peer communication, and we're crushing hierarchies because of the technology. Yet as we saw early on, on the internet no one knows you're a dog (Adam Kovacevich: That's right.) or where they're coming from. Adam, any closing thoughts for our audience today?

Adam Kovacevich

No, thanks for having me. This has been great.

Richard Helppie

Great. We've been talking today with Adam Kovacevich from the Chamber of Progress. My hope is that we can have him come back because there's a lot to talk about on this. This technology is going to affect all of our lives. Until then this is your host Rich Helppie, signing off on The Common Bridge.

Announcer

Thanks for joining us on The Common Bridge. Subscribe to The Common Bridge on substack.com or use their Substack app where you can find more interviews, columns, videos and nonpartisan discussions of the day. Just search for The Common Bridge. You can also find The Common Bridge on Mission Control Radio on your Radio Garden app.

Transcribed by Cynthia Silveri
