🎧 openpodme

Computer Says Maybe

Society · Technology

Technology is changing fast. And it's changing our world even faster. Host Alix Dunn interviews visionaries, researchers, and technologists working in the public interest to help you keep up. Step outside the hype and explore the possibilities, problems, and politics of technology. We publish weekly.

Latest episodes of the Computer Says Maybe podcast

Page 1 of 2
  1. Gotcha! ScamGPT w/ Lana Swartz & Alice Marwick (00:54:31)

    Thought we were at peak scam? Well, ScamGPT just entered the chat.

    More like this: Gotcha! The Crypto Grift w/ Mark Hays

    This is part three of Gotcha! — our series on scams, how they work, and how technology is supercharging them. This week Lana Swartz and Alice Marwick join Alix to discuss their primer on how generative AI is automating fraud.

    We dig into the very human, very dark world of the scam industry, where the scammers are often being exploited in highly sophisticated human trafficking operations — and are now using generative AI to scale up and speed up.

    We talk about how you probably aren’t going to get a deepfake call from a family member demanding a ransom, but the threats are still evolving in ways that are scary and, until now, largely unregulated. And as ever, even though the problems are made worse by technology, we explore the limitations of technology and laws to stem the tide.

    Further reading & resources:
    - Read the primer here!
    - More about Lana Swartz
    - More about Alice Marwick
    - New Money by Lana Swartz
    - Scam: Inside Southeast Asia's Cybercrime Compounds by Mark Bo, Ivan Franceschini, and Ling Li
    - Revealed: the huge growth of Myanmar scam centres that may hold 100,000 trafficked people
    - Al Jazeera True Crime Report on scamming farms in South East Asia
    - Scam Empire project by the Organised Crime and Corruption Reporting Project

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  2. NYC Live: Let Them Eat Compute (00:52:34)

    This just in with data centers: energy grids are strained, water is scarce, utility costs are through the roof — ah well, let them eat compute, I guess!

    More like this: AI Thirst in a Water-Scarce World w/ Julie McCarthy

    It was just climate week in NYC, and we did a live show on data centers with four amazing guests from around the US. Thank you to the Luminate Foundation for sponsoring this live show, and to all of our NY-based friends and network from around the world who made it to Brooklyn for a magical evening. You can also watch the live recording on YouTube.

    KeShaun Pearson (Memphis Community Against Pollution) breaks down how Elon Musk’s xAI supercomputer is polluting the air of historically Black neighborhoods in Memphis, and how organizers are fighting back against yet another chapter of corporate extraction in their communities.

    KD Minor (Alliance for Affordable Energy) demystifies the energy impacts of data centers in Louisiana and shares organizing strategies to mobilize community opposition to Big Tech and Big Oil infrastructure.

    Marisol (No Desert Data Center) talks about their grassroots coalition’s recent win in Tucson to stop Amazon’s Project Blue data center proposal, which threatened the city’s scarce water supply, and how they’re organizing for future protections.

    Amba Kak (AI Now Institute) talks us through the bigger picture: what’s behind Big Tech’s AI data center expansion, who stands to benefit from this boom, and what we sacrifice in return.

    Further reading & resources:
    - Amazon Web Services is the company behind Tucson’s Project Blue, according to a 2023 county memo — from Luminaria
    - Tucson to create new policies around NDAs following the council’s regret around not knowing more about Project Blue — from Luminaria
    - How Marana, also in the Tucson area, employed an ordinance to regulate water usage after learning about data center interest in the area
    - xAI has requested an additional 150 MW of power for Colossus in Memphis, bringing it to a total of 300 MW
    - Time reports on the increase in nitrogen dioxide pollution around Memphis due to xAI turbines
    - KeShaun and Justin Pearson on Democracy Now discussing xAI’s human rights violations
    - Meta’s Mega Data Center Could Strain Louisiana’s Grid — and Entergy Isn’t Prepared — report by the Alliance for Affordable Energy
    - 'A Black Hole of Energy Use': Meta's Massive AI Data Center Is Stressing Out a Louisiana Community — 404 Media

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  3. Are AI Companies Cooking the Books? w/ Sarah Myers West (00:39:54)

    OpenAI just secured a bizarre financial deal with Nvidia — but the math is not mathing. Is the AI sector an actual market, or a series of high-profile announcements of circular relationships between a tiny number of companies?

    More like this: Making Myths to Make Money w/ AI Now

    Alix sat down with Sarah Myers West to go through the particulars of this deal, and other similar deals that are propping up AI’s industry of vapour. This is not your traditional bubble that’s about to burst — there is no bubble, it’s just that The New Normal is to pour debt into an industry that cannot promise any returns…

    Further reading & resources:
    - More on the Nvidia OpenAI deal — CNBC
    - Analysts refer to the deal as ‘vendor financing’ — Insider Monkey
    - Spending on AI is at Epic Levels. Will it Ever Pay Off? — WSJ
    - OpenAI, SoftBank, and Oracle spending $500bn on data centre expansion in Abilene — Reuters
    - How Larry Ellison used the AI boom and the Tony Blair Institute to bolster his wealth
    - Oracle funding OpenAI data centers with heaps of debt and will have to borrow at least $25bn a year — The Register

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  4. Gotcha! How MLMs Ate the Economy w/ Bridget Read (00:50:11)

    Multi-level marketing schemes have built an empire by enticing people with promises of self-realisation and economic freedom. The cost is simple: exploit and be exploited.

    More like this: Worker Power & Big Tech Bossmen w/ David Seligman

    This is part two of Gotcha! — our series on scams, how they work, and how technology is supercharging them. This week Bridget Read came to Alix with a very exciting business opportunity. Bridget authored Little Bosses Everywhere — a book on the history of MLM.

    We explore how door-to-door sales in the mid-20th-century US took on the business model of a Ponzi scheme, and transformed the sweaty salesman into an entrepreneurial recruiter with a downline.

    MLM originators were part of a coordinated plan to challenge the New Deal in favour of radical free enterprise, where the only thing holding you back is yourself, and the economy consists solely of consumers selling to each other in a market of speculation. The secret is, no one is selling a product — they’re selling a way of life.

    Further reading & resources:
    - Buy Bridget’s book: Little Bosses Everywhere: How the Pyramid Scheme Shaped America
    - Family Values by Melinda Cooper
    - The Missing Crypto Queen: a podcast by BBC Sounds about a large-scale crypto scam where there wasn’t even any crypto
    - LuLaRoe — the pyramid scheme that tricked American mums into selling cheap clothes to their friends and family with the promise of financial independence
    - My Experience of Being in a Pyramid Scheme (Amway) — a personal account by Darren Mudd on LinkedIn
    - Watch our recent live show at NYC Climate Week

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  5. Gotcha! The Crypto Grift w/ Mark Hays (00:55:33)

    Hey you! Do you want some free internet money? If this sounds too good to be true, that’s because it is!

    More like this: Making Myths to Make Money w/ AI Now

    This is Gotcha! — a four-part series on scams, how they work, and how technology is supercharging them. We start with Mark Hays from Americans for Financial Reform (AFR), and get into one of the biggest tech-fuelled financial scams out there: cryptocurrencies.

    Like many things that require mass buy-in, crypto started with an ideology (libertarianism, people hating on Wall Street post-2008). But where does that leave us now? What has crypto morphed into since then, and how does it deceive both consumers and regulators into thinking it’s something that it’s not?

    Further reading & resources:
    - Seeing Like a State by James C. Scott
    - Capital Without Borders by Brooke Harrington
    - The Politics of Bitcoin by David Golumbia
    - Learn more about Americans for Financial Reform
    - Check out Web3 is Going Just Great by Molly White
    - Line Goes Up by Folding Ideas — an excellent survey of all the tactics and rug-pulls during the height of the NFT boom
    - The Missing Crypto Queen: a podcast by BBC Sounds about a large-scale crypto scam where there wasn’t even any crypto

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  6. Gotcha! (00:01:28)

    Gotcha! is a four-part series on scams, how they work, and how technology is supercharging them — running through to October 10.

    In the series we look at:
    - Crypto: Mark Hays on how a thing touting financial freedom ended up being a kind of fin-cult, rife with scamming
    - Multi-Level Marketing: Bridget Read on the history of the biggest and most successful type of scam that still plagues us today
    - Generative AI: Data & Society’s primer on how generative AI is juicing the scam industrial complex
    - Enshittification: Cory Doctorow on his upcoming book, and how the process of enshittification represents user-hostile practices that scam people into paying more, and ecosystem lock-in

  7. Nodestar: Turning Networks into Knowledge w/ Andrew Trask (00:43:15)

    What if you could listen to multiple people at once, and actually understand them?

    More like this: The Age of Noise w/ Eryk Salvaggio

    In our final instalment (for now!) of Nodestar, Andrew Trask shares his vision for a world where we can assemble understanding from data everywhere — but not in a way that requires corporate control of our world.

    If broadcasting is the act of talking to multiple people at once — what about broad listening? Where you listen to multiple sources of information, and actually learn something, without trampling over the control that individuals have over who sees what, when.

    Andrew says that broad listening is difficult to achieve because of three huge problems: information overload, privacy, and veracity — and we are outsourcing these problems to central authorities, who abuse their power in deciding how to relay information to the public. What is Andrew doing at OpenMined to remedy this? Building protocols that decentralise access to training data for model development, obviously.

    Further reading & resources:
    - The Computer as a Communication Device by JCR Licklider and Robert W Taylor, 1968
    - World Brain by HG Wells
    - Learn more about OpenMined
    - We’re gonna be streaming LIVE at Climate Week — subscribe to our YouTube

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  8. Nodestar: Building Blacksky w/ Rudy Fraser (00:41:46)

    Social media isn’t really social anymore. But that might be changing. Rudy Fraser over at Blacksky Algorithms has built something new: the infrastructure to provide a safe online space for the Black community, and in the process he challenges the ideas of hierarchical, centralised networks. His work — even outside the very cool development of Blacksky — is an amazing, concrete example of how the abstract ambitions of decentralisation can provide real value for people, and sets us up for a new kind of tech politics.

    More like this: How to (actually) Keep Kids Safe Online w/ Kate Sim

    This is part two of Nodestar, our three-part series on decentralisation. Blacksky is a community built using the AT Protocol by Rudy Fraser. Rudy built this both out of a creative drive to make something new using protocol thinking, and out of frustration over a lack of safe community spaces for Black folks where they could be themselves, and not have to experience anti-Black racism or misogynoir as a price of entry.

    Rudy and Alix discuss curation as moderation, the future of community stewardship, freeing ourselves from centralised content decision-making, how technology might connect with mutual aid, and the beauty of what he refers to as ‘dotted-line communities’.

    Further reading:
    - Blacksky Algorithms
    - Blacksky the app — if you want an alternative to Bluesky
    - More about Rudy Fraser
    - Open Collective — a fiscal host for communities and non-profits
    - Paper Tree — community food bank
    - The Implicit Feudalism of Online Communities by Nathan Schneider
    - Flashes — a third-party Bluesky app for viewing photos
    - The Tyranny of Structurelessness by Joreen

    Rudy is a technologist, community organizer, and founder of Blacksky Algorithms, where he builds decentralized social media infrastructure that prioritizes community-driven safety, data ownership, and interoperability. As a Fellow at the Applied Social Media Lab at Harvard’s Berkman Klein Center for Internet & Society, he advances research and development on technology that empowers marginalized communities, particularly Black users.

  9. Nodestar: The Eternal September w/ Mike Masnick (00:52:43)

    How did the internet become three companies in a trenchcoat? It wasn’t always that way! It used to be fun, and weird, and full of opportunity. To set the scene for the series, we spoke to a stalwart advocate of decentralisation, Mike Masnick.

    More like this: Big Tech’s Bogus Vision for the Future w/ Paris Marx

    This is part one of Nodestar, a three-part series on decentralisation: how the internet started as a wild west of decentralised exploration, got centralised into the hands of a small number of companies, and how the pendulum has begun its swing in the other direction.

    In this episode Mike Masnick gives us a history of the early internet — starting with what was called the Eternal September, when millions of AOL users flooded the scene, creating a messy, unpredictable, exciting ecosystem of open protocols and terrible UIs.

    Further reading & resources:
    - Protocols, Not Platforms by Mike Masnick
    - List of apps being built on AT Protocol
    - Graze — a service to help you make custom feeds with ads on AT Protocol
    - Otherwise Objectionable — an eight-part podcast series on the history of Section 230
    - Techdirt podcast
    - CTRL-ALT-SPEECH podcast

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  10. Short: UK Groups Sue To Block Data Center Expansion (00:13:56)

    Foxglove and Global Action Plan have just sued the UK government over its YOLO hyperscale data center plans.

    More like this: Net0++: Data Centre Sprawl

    Local government rejected the data center. But Starmer’s administration overruled them. They want to force the development of a water-guzzling, energy-draining data center on a local community that has said no. And all of this is on the green belt. The lawsuit filed this week might put a stop to those plans.

    Alix sat down with Ollie Hayes from Global Action Plan and Martha Dark from Foxglove to discuss the legal challenge filed this week. Why now? Isn’t the UK aiming for net zero? And how does this relate to the UK government’s wider approach to AI?

    Further reading & resources:
    - Read the Guardian article about the suit
    - Read the Telegraph piece about the suit
    - Donate to the campaign
    - Data Centre Finder on Global Action Plan

    Computer Says Maybe Shorts bring in experts to give their ten-minute take on recent news. If there’s ever a news story you think we should bring in expertise on for the show, please email pod@themaybe.org

  11. Big Tech’s Bogus Vision for the Future w/ Paris Marx (00:41:12)

    What’s the deal with Silicon Valley selling imagined futures and never delivering on them? What are the consequences of an industry all-in on AI? What if we thought more deeply than just ‘more compute’?

    More like this: Big Dirty Data Centres with Boxi Wu and Jenna Ruddock

    This week, Paris Marx (host of Tech Won’t Save Us) joined Alix to chat about his recent work on hyperscale data centres, and his upcoming book on the subject.

    We discuss everything from the US shooting itself in the foot with its lack of meaningful industrial policy, to how decades of lackluster political vision from governments created a vacuum that has now been filled with Silicon Valley's garbage ideas. And of course, how the US’s outsourcing of manufacturing to China has catalysed China’s domestic technological progress.

    Further reading & resources:
    - Buy Road to Nowhere: What Silicon Valley Gets Wrong About the Future of Transportation by Paris Marx
    - Data Vampires — limited series on data centres by Tech Won’t Save Us
    - Apple in China by Patrick McGee

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  12. Consciously Uncoupling from Silicon Valley w/ Cori Crider (00:53:45)

    How do we yank power out of tech oligarchs’ hands without handing it over to someone else?

    More like this: Is Digitisation Killing Democracy? w/ Marietje Schaake

    Cori Crider is a fearless litigator turned market-shaping advocate. She started litigating during many years at leading human rights organisation Reprieve, then moved on to co-founding Foxglove so she could sue Big Tech. Now she’s set her sights on market concentration.

    Cori’s analysis concludes with a hopeful message: we are not stuck in place with eight dudes running the show. In fact, we’ve been here before. The computer age never would have happened the way it did if thousands of patents hadn’t been liberated from Bell Labs in 1956. How can we use similar tactics to dethrone monopolies, and think about how Europe and other large jurisdictions can decouple themselves from Silicon Valley infrastructure?

    Further reading & resources:
    - Antitrust Policy for the Conservative by Mark Meador of the FTC
    - The Open Markets Institute
    - The Future of Tech Institute

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

    Do you have an idea for the show? Email pod@themaybe.org

  13. After the FAccT: Labour and Misrepresentation (00:50:54)

    Did you miss FAccT? We interviewed some of our favourite session organisers!

    More like this: Part One of our FAccT roundup: Materiality and Militarisation

    Georgia, Soizic, and Hanna from The Maybe team just went to FAccT. Georgia and Soizic interviewed a bunch of amazing researchers, practitioners, and artists to give you a taste of what the conference was like if you didn’t get to go. Alix missed it too — you’ll learn along with her!

    In part two we look into how AI is used to misrepresent people through things like image generation, and even care labour. These are conversations about AI misrepresenting hidden identities, care work becoming data work, how pride and identity are tied to labour — and how labour organisers are building solidarity and movement around this.

    Who features in this episode:
    - Priya Goswami brought a multimedia exhibition to FAccT: Digital Bharat. This explores the invisibilised care work and manual labour by women in India, and how their day-to-day has become mediated by digital public infrastructures.
    - Kimi Wenzel organised Invisible by Design? Generative AI and Mirrors of Misrepresentation, which invited users to confront generated images of themselves and discuss issues of representation within these systems.
    - Alex Hanna and Clarissa Redwine ran the AI Workers Inquiry, which brought people together to share how AI has transformed their work, identify common ground, and potentially begin building resistance.

    Further reading & resources:
    - Circuit Breakers — tech worker conference organised by Clarissa Redwine
    - Kimi Wenzel’s research
    - Buy The AI Con by Alex Hanna and Emily Bender

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  14. Short: Musk: Reanimating Apartheid w/ Nic Dawes (00:14:20)

    In May, Grok couldn’t stop talking about white genocide. This injection of right-wing South African politics triggered a conversation with a Musk contemporary, Nic Dawes.

    In this short, Nic shares his perspective on how post-apartheid white communities have dealt with apartheid’s end. And how Musk is basically seeking out an information environment that can recreate the apartheid information system: Grok is just an extension of a media ecosystem designed to soothe guilt and stoke resentment.

    Computer Says Maybe Shorts cover recent news with an expert in our network. If there is a news story you want us to cover, please email pod@themaybe.org

    Nic is Executive Director at THE CITY, a news outlet serving the people of New York through independent journalism that holds the powerful to account, deepens democratic participation, and helps make sense of the greatest city in the world. He has led news and human rights organizations on three continents, and was previously Deputy Executive Director of Human Rights Watch, Chief Content Officer of Hindustan Times in Delhi, and Editor-in-Chief of South Africa's Mail & Guardian newspaper.

  15. After the FAccT: Materiality and Militarisation (01:04:20)

    Georgia, Soizic, and Hanna from The Maybe team just went to FAccT. Georgia and Soizic interviewed a bunch of amazing researchers, practitioners, and artists to give you a taste of what the conference was like if you didn’t get to go. Alix missed it too — you’ll learn along with her!

    In part one we explore the depth of AI’s hidden material impacts, including its use in military applications and to aid genocide. One of our interviewees talked about why they spoke up at the town hall — questioning why FAccT, the biggest AI ethics conference there is, accepts sponsorship from those same military contractors.

    Who we interviewed for Part One:
    - Charis Papaevangelou, who co-organised a CRAFT session called The Hidden Costs of Digital Sovereignty. Greece is trying to position itself as a central digital hub by building data centres and participating in the ‘fourth industrial revolution’ — but what does this actually mean for the people and infrastructure of Greece?
    - Georgia Panagiotidou ran a session on The Tools and Tactics for Supporting Agency in AI Environmental Action — offering some ideas on how the community can get together and meaningfully resist extractive practices.
    - David Widder discussed his workshop on Silicon Valley and The Pentagon, and his research on the recent history of the DoD funding academic papers — is it ever worth taking military money, even for basic research?
    - Tania Duarte offered something very different: a demonstration of two workshops she runs for marginalised groups, to better explain the true materiality of AI, and build knowledge that gives people more agency over the dominant narratives and framings in the industry.

    Further reading & resources:
    - Recording of Charis’s CRAFT session: The Hidden Cost of Digital Sovereignty
    - Cloud hiding undersea: Cables & Data Centers in the Mediterranean crossroads by Theodora Kostaka
    - Basic Research, Lethal Effects: Military AI Research Funding as Enlistment by David Widder
    - Why ‘open’ AI systems are actually closed, and why this matters by David Widder
    - The video that David quoted the Carnegie Mellon professor from — David was paraphrasing in the episode!
    - We and AI & Better Images of AI
    - More on Georgia Panagiotidou’s work and resources from her session

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  16. Making Myths to Make Money w/ AI Now (00:38:08)

    AI Now have just released their 2025 AI Landscape report — Artificial Power. Alix sat down with two of its authors, Amba Kak and Sarah Myers West, for a light unpacking of the themes within.

    This report isn’t a boring survey of what AI Now have been doing this year; it’s a comprehensive view of the state of AI, and the concentrated powers that prop it up. What are the latest AI-shaped solutions that the hype guys are trying to convince us are real? And how can we reclaim a positive agenda for innovation — and unstick ourselves from a path towards pseudo-religious AGI?

    Further reading & resources:
    - Read the AI Now 2025 Landscape Report: Artificial Power

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

    Amba Kak has spent the last fifteen years designing and advocating for technology policy in the public interest, across government, industry, and civil society roles, and in many parts of the world. Amba brings this experience to her current role co-directing AI Now, a New York-based research institute where she leads on advancing diagnosis and actionable policy to tackle concerns with artificial intelligence and concentrated power. She has served as Senior Advisor on AI to the Federal Trade Commission and was recognized as one of TIME’s 100 Most Influential People in AI in 2024.

    Sarah Myers West has spent the last fifteen years interrogating the role of technology companies and their emergence as powerful political actors on the front lines of international governance. Sarah brings this depth of expertise to policymaking in her current role co-directing AI Now, with a focus on addressing the market incentives and infrastructures that shape tech’s role in society at large and ensuring it serves the interests of the public. Her forthcoming book, Tracing Code (University of California Press), draws on years of historical and social science research to examine the origins of data capitalism and commercial surveillance.

  17. Is Computer Science Made for Dudes? w/ Felienne Hermans (00:54:41)

    Felienne Hermans calls herself an ‘involuntary ethnographer of computer science’. She studies the culture behind programming, and challenges the dominant idea that learning to program has to be painful.

    Alix and Felienne chat about the history of programming and how it went from multidisciplinary and inclusive, to masochistic and exclusive. They also dig into all the ways it excludes women and people who do not speak English.

    Further reading & resources:
    - Scratch — a high-level programming language aimed at kids
    - Hedy — the programming language that Felienne designed
    - Join in and help out with Hedy!
    - GenderMag by Margaret Burnett — how to ensure more gender inclusiveness in your software
    - Elm — an easy and kind browser-based programming language
    - A Case for Feminism in Programming Language Design by Felienne Hermans & Ari Schlesinger
    - A Framework for the Localization of Programming Languages by Felienne Hermans & Alaaeddin Swidan

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

    Felienne is the creator of the Hedy programming language, a gradual and multi-lingual programming language designed for teaching. She is the author of “The Programmer’s Brain”, a book that helps programmers understand how their brain works and how to use it more effectively. In 2021, Felienne was awarded the Dutch Prize for ICT research. She also has a weekly column on BNR, a Dutch radio station.

  18. The Elephant in the Algorithm: Live from ZEG Fest in Tbilisi (00:46:15)

    Smart people focused on technology politics get it. We trade high-level, helpful concepts like surveillance capitalism, automated inequality, and enshittification. But even as some of these ideas make it into the mainstream, normies aren’t getting the message. We need stories for that. But how? How do we take the technical jargon and high-level concepts that dominate tech narratives and instead create stories that are personal, relatable, and powerful? And how do we combat the amazing hero-god narratives of Silicon Valley without reinforcing them?

    Alix went to storytelling festival ZEG Fest in Tbilisi to chat with three amazing storytellers about that challenge:
    - Armando Iannucci, creator of Veep and The Thick of It, discusses how to use humour and satire to keep things simple — and how stories are not ‘made up’, but rather a way to relay a series of facts and concepts that are complex and difficult to process.
    - Chris Wylie, Cambridge Analytica whistleblower, on how the promise of superintelligence and transhumanism is basically like a religious prophecy. His new show Captured explores the stories that tech elites are telling us about our utopian AI future.
    - Adam Pincus, producer of The Laundromat and Leave No Trace, shares his frustrations with the perceived inevitability of AI in his day-to-day, and tells us more about his podcast series ‘What Could Go Wrong?’ in which he explores writing a Contagion sequel with screenwriter Scott Z. Burns.

    Further reading & resources:
    - Captured: The Secret Behind Silicon Valley’s AI Takeover — limited podcast series featuring Chris Wylie
    - ‘Contagion’ Screenwriter Scott Z. Burns Asks AI to Write a Sequel to Pandemic Film in Audible Original Series ‘What Could Go Wrong?’ — Variety article
    - What Could Go Wrong? — limited podcast series by Scott Burns

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

  19. Is Digitisation Killing Democracy? w/ Marietje Schaake (00:37:56)

    There has been an intentional and systematic narrative push that tells governments they are not good enough to provide their own public infrastructure, or to regulate the tech companies that provide it for them.

    Shocking: these narratives stem from large tech companies, and this represents what Marietje Schaake refers to as a Tech Coup — which is the title of her book (which you should buy!).

    The Tech Coup refers to the inability of democratic policymakers to provide oversight, regulation, and even visibility into the structural systems that Big Tech is building, managing, and selling. Marietje and Alix discuss what happens when you have a system of states whose knowledge and confidence have been gutted over decades — hindering them from providing good services, and from understanding how to meaningfully regulate the tech space.

    Further reading & resources:
    - Buy The Tech Coup by Marietje Schaake

    Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!

    Marietje Schaake is a non-resident Fellow at Stanford’s Cyber Policy Center and at the Institute for Human-Centered AI. She is a columnist for the Financial Times and serves on a number of not-for-profit boards, as well as the UN's High-Level Advisory Body on AI. Between 2009 and 2019 she served as a Member of the European Parliament, where she worked on trade, foreign, and tech policy. She is the author of The Tech Coup.

  20. AI in Gaza: Live from Mexico City (01:00:02)

    This episode contains some descriptions of torture methods, automated human targeting by machines, and psychological warfare throughoutLast week Alix hosted a live show in Mexico City right after REAL ML. Four panellists discussed a huge important topic, which has been wrongfully deemed as taboo by other conferences: the use of AI and other technologies to support the ongoing genocide in Palestine.Here’s a preview of what the four speakers shared:Karen Palacio AKA kardaver gave us an overview of Operation Condor — a program of psychological warfare that ran in the late 20th century in South America to suppress activist voices.Marwa Fatafta explains how these methods are still used today against Palestinians; there are coordinated surveillance projects that make Palestinian citizens feel they are living in a panopticon, and the granular data storage and processing is facilitated by AWS, Google, and Azure.Matt Mahmoudi goes on to describe how these surveillance projects have crystallised into sophisticated CCTV and facial recognition networks through which Palestinians are continuously dehumanised via face-scanning and arbitrary checks that restrict movements.Wanda Muñez discusses how fully autonomous weapons obviously violate human rights in all kinds of ways — but ‘AI ethics’ frameworks never make any considerations for machines that make life or death decisions.Further reading & resources:The Biometric State by Keith Breckenridge — where the phrase ‘automated apartheid’ was conceivedCOGWAR Report by Karen Palacio, AKA KardaverSubscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!Wanda Muñez is an international consultant with twenty years of experience in the design, implementation and evaluation of programs and policies on human rights, gender equality, inclusion and the rights of people with disabilities. 
Wanda has worked for international NGOs and UN organizations in Asia, Africa, Europe and Latin America. She became involved in the field of artificial intelligence in 2017, initially through analysing its intersection with International Humanitarian Law on the issue of autonomous weapons systems, and later focusing on the intersection between human rights and AI. In 2020, she was nominated by the Ministry of Foreign Affairs of Mexico as an independent expert at the Global Partnership on Artificial Intelligence (GPAI), where she contributed to various publications and panels, and led the design of the research “Towards true gender equality and diversity in AI”, which is currently being implemented. In 2020, Wanda was recognized by the Nobel Women's Initiative as "a peacebuilder working for peace, justice and equality" and by UNLIREC as one of Latin America's "forces of change, working for humanitarian disarmament, non-proliferation and arms control". Wanda also recently won the DEI Champion of the Year Award from Women in AI.

Karen Palacio, AKA kardaver, is an interdisciplinary digital artist, industrial programmer specialized in AI, and data scientist from Córdoba, Argentina. She researches and creates through iterative loops of implementation and reflection, aiming to understand what it means to articulate artistic-technological discourses from the Global South. Her performances, installations, and audiovisual works engage critically and rootedly with the depths of computation, the histories of computing and archives, freedom of knowledge, feminisms, and the pursuit of technological sovereignty. She develops and works with Free Software in her processes, resemanticizing technologies she knows from her background as an industrial programmer.

Dr Matt Mahmoudi is Assistant Professor in Digital Humanities at the University of Cambridge, and a Researcher/Advisor on Artificial Intelligence and Human Rights at Amnesty International.
Matt’s work has looked at AI-driven surveillance from the NYPD’s surveillance machine to Automated Apartheid in the occupied Palestinian territory. Matt is the author of Migrants in the Digital Periphery: New Urban Frontiers of Control (University of California Press, February 2025), and co-editor of Resisting Borders & Technologies of Violence (Haymarket, 2024) together with Mizue Aizeki and Coline Schupfer.

Marwa Fatafta leads Access Now’s policy and advocacy work on digital rights in the Middle East and North Africa (MENA) region. Her work spans a number of issues at the nexus of human rights and technology, including content governance and platform accountability, online censorship, digital surveillance, and transnational repression. She has written extensively on the digital occupation in Palestine and focuses on the role of new technologies in armed conflicts and humanitarian contexts, and their impact on historically marginalized and oppressed communities. Marwa is a Policy Analyst at Al-Shabaka: The Palestinian Policy Network, an advisory board member of the Tahrir Institute for Middle East Policy, and an advisory committee member for Bread&Net. Marwa was a Fulbright scholar in the US and holds an MA in International Relations from the Maxwell School of Citizenship and Public Affairs, Syracuse University. She holds a second MA in Development and Governance from the University of Duisburg-Essen.

  21. Logging Off w/ Adele Walton (00:43:57)

    Adele Walton’s new book *Logging Off: The Human Cost of our Digital World* is out NOW — for this week’s episode Alix sat down with her to discuss the book, and what pushed her to write it.

Adele shares her experiences of using social media from age ten, and of growing up only ever feeling ‘understood’ by her followers. The constant ‘how can I make content out of this??’ mindset has followed her into adult life.

Adele has been severely affected by online harms through the loss of her sister, and is working to use her lived experience in her campaigning and advocacy work. The answer for Adele has never been to go full Luddite and reject social media — rather, she wants to make online spaces safer for everyone.

Further reading & resources:

Buy Adele’s book: Logging Off: The Human Cost of our Digital World

The Facebook Eye by Nathan Jurgenson — 2012 article from The Atlantic

Smartphone Free Childhood

Ripple — a suicide prevention browser extension

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

Adele Zeynep Walton is a British-Turkish journalist, online safety campaigner, and the author of Logging Off: The Human Cost of Our Digital World. She is a campaigner with Bereaved Families for Online Safety, a youth ambassador for People Vs Big Tech, and a founding member of the EU youth movement Ctrl + Alt + Reclaim. She is the founder of Logging Off Club, a community that brings people together offline at phone-free events to reconnect with themselves and others across the UK. As a Gen Z who grew up on social media, Adele regularly speaks about digital wellbeing, social connection, and rebuilding empathy in a polarised world.

Adele has written for The Guardian, The Independent, the i, Dazed, i-D, VICE, Metro, Refinery 29, The Big Issue, Jacobin, Open Democracy, gal-dem, Computer Weekly and more.
Her articles have been translated into Brazilian Portuguese, German, Italian, Swedish, Turkish and Spanish, and she has been interviewed on Times Radio, LBC Radio, Sky News, BBC Radio Scotland, Channel 4 News and more. Between 2023 and 2024 Adele was DAZED's first ever political book columnist, interviewing authors including Naomi Klein, Emma Dabiri, Vicky Spratt and more.

  22. Short: Sam Altman’s World w/ Billy Perrigo (00:20:53)

    Sam Altman is doing another big infrastructure push with World (previously Worldcoin) — a universal human verification system.

We had journalist Billy Perrigo on to chat about what’s what with World. Is Sam Altman just providing a solution to a problem he himself caused with OpenAI? Do we really need human verification, or is this just a way to side-step the AI content watermarking issue?

Further reading & resources:

The Orb Will See You Now by Billy Perrigo

The ethical implications of AI agents by DeepMind

Computer Says Maybe Shorts bring in experts to give their ten-minute take on recent news. If there’s ever a news story you think we should bring in expertise on for the show, please email pod@saysmaybe.com

Billy Perrigo is a correspondent at TIME, based in the London bureau. He covers the tech industry, focusing on the companies reshaping our world in strange and unexpected ways. His investigation ‘Inside Facebook’s African Sweatshop’ was a finalist for the 2022 Orwell Prize.

  23. The Collective Intelligence Project w/ Divya Siddarth and Zarinah Agnew (00:54:31)

    Most of the time we interview people who say No to AI. In this interview, Georgia and Alix talk to two people who look at AI and ask How and For What. And lots of other questions too.

Divya Siddarth and Zarinah Agnew from the Collective Intelligence Project share CIP’s work using AI systems to explore more consultative democratic governance, and how to reframe knowledge as something social and relational — pulling our thinking out of the individual frame and into collective and communal applications. In Zarinah’s words, they are interested in what happens “between brains, not within brains”.

A ‘community chatbot’ might sound cringe, but Divya and Zarinah are working to make these tools valuable and useful, rather than addictive and sycophantic. If you’re skeptical of the utility of engaging with these toxic corporate towers of AI at all, this is an episode for you.

Further reading & resources:

Why We Need an Amistics for AI by Brian Boyd

Collective Constitutional AI project with CIP and Anthropic

Global Dialogues launch announcement

I Tested The AI That Calls Your Elderly Parents If You Can't Be Bothered by Joseph Cox from 404 Media

Worker Power & Big Tech Bossmen w/ David Seligman

The Orb Will See You Now by Billy Perrigo

The Intimacy Dividend by Shuwei Fang

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

Divya Siddarth is the executive director and co-founder of CIP. Previously, she was a political economist and social technologist in Microsoft’s Office of the CTO and the AI and Democracy lead at the U.K.’s AI Safety Institute, and held positions at the Ethics in AI Institute at Oxford, the Ostrom Workshop, and the Harvard Safra Center. She graduated from Stanford with a B.S. in Computational Decision Analysis in 2018.

Zarinah is Research Director at the Collective Intelligence Project, where they work on transforming public input into impactful change in the AI ecosystem.
Previously a neuroscientist, Zarinah now focuses on the science of collectivity and related emerging technologies. Zarinah is faculty at the London College of Political Technology, where they teach Future Crafting. In their spare time, some might argue, they run too many non-profits.

  24. Net0++: Data Center Sprawl | NEW Research from The Maybe (00:54:09)

    We’re excited to finally share our report on data center expansion and resistance around the world. It’s been a labor of love, but it also showcases the amazing work of many organisations, activists, and journalists around the world who are working to create space for meaningful consultation about hugely consequential decisions. Download it here.

In short, the report includes five case studies on data center development across the globe. We focused on understanding how companies approach policymakers, what information is made available to communities, how decisions to develop data centers are made, and, when communities decide to resist their development, what the outcomes have been.

The ONE big similarity across all case studies is that information about data center development was consistently hard to find: information about environmental impacts, urban planning, and even the identity of the companies proposing these projects has been almost impossible to uncover.

We end the report with some recommendations for how to increase transparency and crack open democratic consultation of the communities on the front lines of this behemoth tech infrastructure.

Further reading:

Read the report here!

A short More Perfect Union doc about living 400 yards from a data center

Data Center Dynamics

xAI's Memphis Neighbors Push for Facts and Fairness from Tech Policy Press

If you have any thoughts or feedback about the report, please email research@themaybe.org

**Subscribe to our newsletter to get invites to community calls around data center resistance.**

Chris Cameron has been a scientist and researcher for over a decade and has worked in environmental justice policy since 2021. Her interest in investigating human rights violations related to environmental injustices has led to her current research into strategic litigation support for communities experiencing harm from data centers.
Chris’s previous work has centered on co-designing projects with communities around environmental rights advocacy and digital storytelling. She also hosts a radio show called Sound Ecology, a space for climate-oriented artists to share their sonic investigations as toolkits for the climate collapse. Contact Chris at cameroncscoop@gmail.com to speak more about data center litigation strategies and the intersection of technology and environmental justice.

Prathm Juneja is the Research Strategist at The Maybe and a PhD Candidate in Social Data Science at the University of Oxford, where his research examines AI and elections from a technical and ethical perspective. He works at the intersection of AI, research, industry, and politics, spending most of his time advising governments, civil society organizations, and companies on civic tech and tech policy.

  25. Net 0++: AI Thirst in a Water-Scarce World w/ Julie McCarthy (00:53:29)

    Last year, Elon Musk’s xAI built a data centre in Memphis in 19 days — and the local government only found out about it on the 20th day. How?

Julie McCarthy and her team at NatureFinance have just released a report on the nature-related impacts of data centre development globally. There are some pretty dire statistics in there: 55% of data centres are developed in areas already at risk of drought. So why do they get built there?

Julie also shares the longer arc of her career, which began in extractive-industry transparency and included time leading the Open Government Partnership and the Economic Justice Program at Open Society Foundations. She brings all of that experience together for an insightful conversation about what is happening with tech infrastructure expansion and what we should do about it.

Further reading & resources:

Kate Raworth’s Doughnut Economics

NatureFinance website

Navigating AI’s Thirst in a Water-Scarce World — by NatureFinance

Elon Musk building an xAI data centre in 19 days — report by Time Magazine

OSF’s Economic Justice Programme

The Entrepreneurial State by Mariana Mazzucato

**Subscribe to our newsletter to get more stuff than just a podcast — we run events and do other work that you will definitely be interested in!**

Julie is NatureFinance’s CEO. She was founding co-director of the Open Society Foundations’ (OSF) Economic Justice Program, a $100 million per annum global grantmaking and impact investment program focused on issues of fiscal justice, workers’ rights, and corporate power. Previous roles include serving as the founding director of the Open Government Partnership (OGP), and as a Franklin Fellow and peacebuilding adviser at the U.S. Mission to the United Nations, focused on Liberia. Prior to this, McCarthy co-founded the Natural Resource Governance Institute (NRGI), serving as its deputy director until 2009.
She is a Brookings non-resident fellow in the Center for Sustainable Development and an Aspen Civil Society Fellow. Julie lives with her three children in Warwick, NY.
