Policy Fiction: what science fiction can teach policy geeks
We are used to studying the past to better understand the present. But what about using AI to learn about the future amid growing uncertainty?
Through storytelling generated by artificial intelligence, we propose an exercise in speculative policy scenarios. Instead of the two- to three-year horizon that dominates policy discussions, we adopt a multi-decade timeframe that is more in keeping with the actual lifecycles of most policy projects.
Our goal is to enable policymakers and the general public to fully visualize how decisions made today could impact people’s lives well into the middle of the 21st century. We would do this not by describing events such as infrastructure projects or the passage of new laws, but rather by telling “retrospective social histories” that bring the human effects of these changes dramatically to life.
We aim to push the debate about policies beyond the traditional realm of specialists, government authorities and investors, and to spark broader conversations about what these policies mean for everyday life.
Just as science fiction helps us to imagine how disruptive technologies can lead to surprising alternate futures, these “policy fiction” stories will take narrative liberties while staying grounded in fact. We will create detailed prompts to develop, through GPT, one pessimistic, one optimistic and one “business as usual” scenario. These will include “plot twists” on topics such as job generation, water use, endangered species, corruption, gender equality and the development of local value-added industries, among others. At least one of the stories will, inevitably, include some outrageous tweets from Elon Musk.
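To make the approach concrete, here is a minimal sketch of how the three scenario prompts could be assembled before being sent to GPT. The wording, the example policy, the 2060 horizon, and the `build_prompt` helper are all illustrative assumptions, not the project’s actual prompt design.

```python
# Hypothetical sketch: assembling one GPT prompt per scenario type.
# Scenario labels and plot-twist topics come from the proposal above;
# everything else (helper name, horizon year, sample policy) is assumed.

SCENARIOS = ["pessimistic", "optimistic", "business as usual"]
TOPICS = ["job generation", "water use", "endangered species",
          "corruption", "gender equality", "local value-added industries"]

def build_prompt(scenario: str, policy: str, horizon: int = 2060) -> str:
    """Compose a prompt for one ~1,000-word retrospective social history."""
    topics = ", ".join(TOPICS)
    return (
        f"Write a roughly 1,000-word retrospective social history, set in "
        f"{horizon}, describing everyday life under a {scenario} outcome of "
        f"the following policy: {policy}. Stay grounded in plausible facts "
        f"and include plot twists touching on: {topics}."
    )

# One prompt per scenario, for a sample (invented) policy.
prompts = [build_prompt(s, "a national lithium-mining strategy")
           for s in SCENARIOS]
```

Each resulting string would then be submitted to the model separately, so the three stories can be compared side by side in workshops.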
Each of the stories, at around 1,000 words, could also be packaged as videos or audio-visual presentations to kick off discussions at in-person or online events. After these workshops, further pieces could be produced from the most innovative insights, such as infographics or interactive scrollytelling pieces.