The 听力课堂 (Tingclass) TED audio column offers MP3 audio of TED talks with transcripts for English learners. This article presents the MP3 and transcript of the talk "How is social media breaking democracy?" We hope you enjoy it!
【Speaker & Introduction】Yaël Eisenstat
After years as a CIA analyst, diplomat, and White House national security advisor, Yaël Eisenstat came to see the breakdown of civil discourse as the greatest threat to American democracy.
【Talk Title】Dear Facebook, this is how you're breaking democracy
【Subtitles】Translated by Bill Fu, reviewed by psjmz mz
Around five years ago, it struck me that I was losing the ability to engage with people who aren't like-minded. The idea of discussing hot-button issues with my fellow Americans was starting to give me more heartburn than the times that I engaged with suspected extremists overseas. It was starting to leave me feeling more embittered and frustrated. And so just like that, I shifted my entire focus from global national security threats to trying to understand what was causing this push towards extreme polarization at home. As a former CIA officer and diplomat who spent years working on counterextremism issues, I started to fear that this was becoming a far greater threat to our democracy than any foreign adversary. And so I started digging in, and I started speaking out, which eventually led me to being hired at Facebook and ultimately brought me here today to continue warning you about how these platforms are manipulating and radicalizing so many of us and to talk about how to reclaim our public square. I was a foreign service officer in Kenya just a few years after the September 11 attacks, and I led what some call "hearts and minds" campaigns along the Somalia border. A big part of my job was to build trust with communities deemed the most susceptible to extremist messaging. I spent hours drinking tea with outspoken anti-Western clerics and even dialogued with some suspected terrorists, and while many of these engagements began with mutual suspicion, I don't recall any of them resulting in shouting or insults, and in some cases we even worked together on areas of mutual interest. The most powerful tools we had were to simply listen, learn and build empathy. This is the essence of hearts and minds work, because what I found again and again is that what most people wanted was to feel heard, validated and respected. And I believe that's what most of us want.
So what I see happening online today is especially heartbreaking and a much harder problem to tackle. We are being manipulated by the current information ecosystem entrenching so many of us so far into absolutism that compromise has become a dirty word. Because right now, social media companies like Facebook profit off of segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking a strong emotion to keep us engaged, often incentivizing the most inflammatory and polarizing voices, to the point where finding common ground no longer feels possible. And despite a growing chorus of people crying out for the platforms to change, it's clear they will not do enough on their own. So governments must define the responsibility for the real-world harms being caused by these business models and impose real costs on the damaging effects they're having on our public health, our public square and our democracy. But unfortunately, this won't happen in time for the US presidential election, so I am continuing to raise this alarm, because even if one day we do have strong rules in place, it will take all of us to fix this.
When I started shifting my focus from threats abroad to the breakdown in civil discourse at home, I wondered if we could repurpose some of these hearts and minds campaigns to help heal our divides. Our more than 200-year experiment with democracy works in large part because we are able to openly and passionately debate our ideas for the best solutions. But while I still deeply believe in the power of face-to-face civil discourse, it just cannot compete with the polarizing effects and scale of social media right now. The people who are sucked down these rabbit holes of social media outrage often feel far harder to break of their ideological mindsets than those vulnerable communities I worked with ever were.
So when Facebook called me in 2018 and offered me this role heading its elections integrity operations for political advertising, I felt I had to say yes. I had no illusions that I would fix it all, but when offered the opportunity to help steer the ship in a better direction, I had to at least try. I didn't work directly on polarization, but I did look at which issues were the most divisive in our society and therefore the most exploitable in elections interference efforts, which was Russia's tactic ahead of 2016. So I started by asking questions. I wanted to understand the underlying systemic issues that were allowing all of this to happen, in order to figure out how to fix it.
Now I still do believe in the power of the internet to bring more voices to the table, but despite their stated goal of building community, the largest social media companies as currently constructed are antithetical to the concept of reasoned discourse. There's no way to reward listening, to encourage civil debate and to protect people who sincerely want to ask questions in a business where optimizing engagement and user growth are the two most important metrics for success. There's no incentive to help people slow down, to build in enough friction that people have to stop, recognize their emotional reaction to something, and question their own assumptions before engaging. The unfortunate reality is: lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality. As long as algorithms' goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses. And yes, anger, mistrust, the culture of fear, hatred: none of this is new in America. But in recent years, social media has harnessed all of that and, as I see it, dramatically tipped the scales. And Facebook knows it. A recent "Wall Street Journal" article exposed an internal Facebook presentation from 2018 that specifically points to the company's own algorithms for growing extremist groups' presence on their platform and for polarizing their users. But keeping us engaged is how they make their money. The modern information environment is crystallized around profiling us and then segmenting us into more and more narrow categories to perfect this personalization process. We're then bombarded with information confirming our views, reinforcing our biases, and making us feel like we belong to something.
These are the same tactics we would see terrorist recruiters using on vulnerable youth, albeit in smaller, more localized ways before social media, with the ultimate goal of influencing their behavior.
Unfortunately, I was never empowered by Facebook to have an actual impact. In fact, on my second day, my title and job description were changed and I was cut out of decision-making meetings. My biggest efforts, trying to build plans to combat disinformation and voter suppression in political ads, were rejected. And so I lasted just shy of six months. But here is my biggest takeaway from my time there. There are thousands of people at Facebook who are passionately working on a product that they truly believe makes the world a better place, but as long as the company continues to merely tinker around the margins of content policy and moderation, as opposed to considering how the entire machine is designed and monetized, they will never truly address how the platform is contributing to hatred, division and radicalization. And that's the one conversation I never heard happen during my time there, because that would require fundamentally accepting that the thing you built might not be the best thing for society and agreeing to alter the entire product and profit model.
So what can we do about this? I'm not saying that social media bears the sole responsibility for the state that we're in today. Clearly, we have deep-seated societal issues that we need to solve. But Facebook's response, that it is just a mirror to society, is a convenient attempt to deflect any responsibility from the way their platform is amplifying harmful content and pushing some users towards extreme views.
And Facebook could, if they wanted to, fix some of this. They could stop amplifying and recommending the conspiracy theorists, the hate groups, the purveyors of disinformation and, yes, in some cases even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed. And they could do all of this without becoming what they call the arbiters of truth.
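The remedies named above, ranking on a metric other than engagement and building in guardrails that hold content back from going viral before review, can be made concrete with a toy sketch. Everything here is invented for illustration (the field names, the `quality_score` metric, the threshold); it is not Facebook's actual system, only a minimal model of the two ideas:

```python
# Illustrative sketch, not any platform's real ranking system: a toy feed
# ranker contrasting pure engagement ranking with (a) optimizing a
# different metric and (b) a "guardrail" that withholds unreviewed posts
# already spreading faster than a threshold.
from dataclasses import dataclass

@dataclass
class Post:
    id: str
    predicted_engagement: float  # reaction the model expects (hypothetical)
    quality_score: float         # hypothetical alternative metric
    shares_last_hour: int
    reviewed: bool = False

VIRALITY_THRESHOLD = 1000  # assumed cutoff for pre-review holds

def rank_by_engagement(posts):
    """The status quo: whatever provokes the strongest reaction wins."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def rank_with_guardrails(posts):
    """Alternative: rank on quality, and hold back unreviewed posts
    that are spreading faster than the threshold."""
    eligible = [p for p in posts
                if p.reviewed or p.shares_last_hour < VIRALITY_THRESHOLD]
    return sorted(eligible, key=lambda p: p.quality_score, reverse=True)

posts = [
    Post("outrage-bait", predicted_engagement=9.8, quality_score=1.2,
         shares_last_hour=5000),
    Post("local-news", predicted_engagement=3.1, quality_score=8.7,
         shares_last_hour=40),
]

print([p.id for p in rank_by_engagement(posts)])    # outrage bait ranks first
print([p.id for p in rank_with_guardrails(posts)])  # outrage bait held for review
```

The point of the sketch is that both changes are ordinary engineering decisions, a different sort key and a filter, which is why the talk frames them as choices rather than technical impossibilities.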
But they've made it clear that they will not go far enough to do the right thing without being forced to, and, to be frank, why should they? The markets keep rewarding them, and they're not breaking the law. Because as it stands, there are no US laws compelling Facebook, or any social media company, to protect our public square, our democracy and even our elections. We have ceded the decision-making on what rules to write and what to enforce to the CEOs of for-profit internet companies. Is this what we want? A post-truth world where toxicity and tribalism trump bridge-building and consensus-seeking? I do remain optimistic that we still have more in common with each other than the current media and online environment portray, and I do believe that having more perspectives surface makes for a more robust and inclusive democracy. But not the way it's happening right now. And it bears emphasizing, I do not want to kill off these companies. I just want them held to a certain level of accountability, just like the rest of society. It is time for our governments to step up and do their jobs of protecting our citizenry. And while there isn't one magical piece of legislation that will fix this all, I do believe that governments can and must find the balance between protecting free speech and holding these platforms accountable for their effects on society. And they could do so in part by insisting on actual transparency around how these recommendation engines are working, around how the curation, amplification and targeting are happening.
You see, I want these companies held accountable not for if an individual posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms are steering people towards it, and how their tools are used to target people with it. I tried to make change from within Facebook and failed, and so I've been using my voice again for the past few years to continue sounding this alarm and hopefully inspire more people to demand this accountability.
My message to you is simple: pressure your government representatives to step up and stop ceding our public square to for-profit interests. Help educate your friends and family about how they're being manipulated online. Push yourselves to engage with people who aren't like-minded. Make this issue a priority. We need a whole-society approach to fix this.
And my message to the leaders of my former employer Facebook is this: right now, people are using your tools exactly as they were designed to sow hatred, division and distrust, and you're not just allowing it, you are enabling it. And yes, there are lots of great stories of positive things happening on your platform around the globe, but that doesn't make any of this OK. And it's only getting worse as we're heading into our election and, even more concerning, as we face our biggest potential crisis yet if the results aren't trusted and violence breaks out. So when in 2021 you once again say, "We know we have to do better," I want you to remember this moment, because it's no longer just a few outlier voices. Civil rights leaders, academics, journalists, advertisers, your own employees, are shouting from the rooftops that your policies and your business practices are harming people and democracy. You own your decisions, but you can no longer say that you couldn't have seen it coming.
Thank you.