
How the internet can rebuild trust

Algorithms and generative AI models that decide what billions of users see should be transparent
{"text":[[{"start":null,"text":"

As AI companies fight for dominance, the temptation to embed bias — commercial, political or cultural — into training data will be immense
"}],[{"start":5.8,"text":"The writer is co-founder of Wikipedia and author of ‘The Seven Rules of Trust’"}],[{"start":11.29,"text":"When I founded Wikipedia in 2001, pioneers of the internet were excited by its promise to give the world access to truth and connection."}],[{"start":22.18,"text":"Two decades later, that optimism has curdled into cynicism. We scroll through feeds serving up news we no longer believe, interact with bots we cannot identify and brace for the next synthetic scandal created by fake images from artificial intelligence."}],[{"start":42.06,"text":"Before the web can move forward, it must remember how it earned trust in the first place."}],[{"start":48.49,"text":"The defining difference between web 1.0 and the platforms that dominate today is not technological sophistication but moral architecture. Early online communities were transparent about process and purpose. They exposed how information was created, corrected and shared. That visibility generated accountability. People could see how the system worked and participate in fixing its mistakes. Trust emerged not from perfection (there was still plenty of online trolling, flame wars and toxicity), but from openness."}],[{"start":84.49000000000001,"text":"Today’s digital landscape reverses that logic. Recommendation algorithms and generative AI models decide what billions of users see, yet their workings remain opaque. When platforms insist their systems are too complex to explain, users are asked to substitute faith for understanding."}],[{"start":105.78,"text":"AI intensifies the problem. Large language models can produce fluent paragraphs and convincing deepfakes. The tools that promised to democratise knowledge now threaten to make knowledge unrecognisable. If everything can be fabricated, the distinction between truth and illusion becomes a matter of persuasion."}],[{"start":127.36,"text":"Re-establishing trust in this environment requires more than fact-checking or content moderation. It requires structural transparency. Every platform that mediates information should make provenance visible: where data originated, how it was processed, and what uncertainty surrounds it. Think of it as nutritional labelling for information. Without it, citizens cannot make informed judgments and democracies cannot function."}],[{"start":156.64,"text":"Equally important is independence. As AI companies fight for dominance, the temptation to embed bias — commercial, political or cultural — into training data will be immense. Guardrails must ensure the entities curating public knowledge are accountable to the public, not just investors."}],[{"start":177.42999999999998,"text":"And we must revive civility too. Some of the best early online spaces relied on norms that valued reasoned argument over insult. They were imperfect but self-correcting because participants felt a duty to the collective project. Today’s social platforms monetise outrage. Restoring trust means designing systems that reward good-faith discourse — through visibility algorithms, community-based moderation, or friction that forces reflection before reposting."}],[{"start":212.55999999999997,"text":"Governments have a role to play but regulation alone cannot rebuild trust. It has to be observed in practice. Platforms should disclose not only how their algorithms work but also when they fail. AI developers should publish dataset sources and error rates."}],[{"start":232.7,"text":"The challenge of our time is not that information is scarce but that authenticity is. 
Important aspects of the early internet succeeded because people could trace what they read to another human being, even if the other human being was operating behind a pseudonym. The new internet must restore that chain of custody."}],[{"start":255.83999999999997,"text":"We are entering an era when machines can mimic any voice and invent any image. If we want truth to survive that onslaught, we must embed transparency, independence and empathy into the digital architecture itself. The early days of the web showed it could be done. The question is whether we still have the will to do it again."}],[{"start":284.46999999999997,"text":""}]],"url":"https://audio.ftcn.net.cn/album/a_1764835851_6780.mp3"}
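The "nutritional labelling for information" idea above can be made concrete. What follows is a minimal, hypothetical sketch in Python of what such a provenance label might record; the field names and structure are assumptions made for illustration, not any platform's actual schema or an existing standard.

# Hypothetical sketch of a provenance "nutrition label" for a piece of content.
# All field names are illustrative assumptions, not a real platform schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProvenanceLabel:
    source: str                       # where the underlying data originated
    collected_on: str                 # when it was gathered (ISO date)
    processing_steps: List[str] = field(default_factory=list)  # how it was transformed
    ai_generated: bool = False        # whether a generative model produced it
    known_uncertainty: str = ""       # caveats the reader should weigh

    def render(self) -> str:
        """Return a short human-readable label, analogous to a food label."""
        lines = [
            f"Source: {self.source}",
            f"Collected: {self.collected_on}",
            f"Processing: {', '.join(self.processing_steps) or 'none declared'}",
            f"AI-generated: {'yes' if self.ai_generated else 'no'}",
            f"Uncertainty: {self.known_uncertainty or 'none declared'}",
        ]
        return "\n".join(lines)

# Example: label an AI-summarised news item so readers can see its lineage.
label = ProvenanceLabel(
    source="Wire report (example)",
    collected_on="2025-11-30",
    processing_steps=["machine translation", "LLM summarisation"],
    ai_generated=True,
    known_uncertainty="summary not reviewed by a human editor",
)
print(label.render())

The point of such a structure is not the particular fields but that every hop in the chain of custody, from original source to the version a reader sees, is declared rather than hidden.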
