
In Isaac Asimov’s “Foundation” series, social scientists predict the fall of the galactic empire they call home. Then they set off on a grand, galaxy-spanning mission to protect the collective knowledge of humanity and discreetly steer history in an effort to shorten, if not prevent, the looming dark age.

Widely considered landmark texts in American science fiction, the books had a profound effect on Professor Kathleen Carley of Carnegie Mellon University. She was especially taken with the idea that group behavior, even at the largest and most complex scale, is understandable — and is even within our power to guide. 

“The ‘Foundation’ series is probably what inspired me to go to college, to get a degree, to do the work I do,” recalls Carley, who has spent her career studying dynamic social networks.

Now, as the director of Carnegie Mellon University’s new Center for Informed Democracy and Social Cybersecurity (IDeaS), Carley will have a chance to lead her own grand mission: fighting the spread of online disinformation.

The Center is being funded by a six-year, $5 million award from the Knight Foundation.

Speaking to NEXTpittsburgh, Carley says the Center will take a broad, interdisciplinary approach to the issue: “We want to be able to bring together policy perspectives, education perspectives, technological perspectives, to provide a more integrated approach to understanding, making sense of and countering the spread of disinformation.”

As public outrage over the spread of false information on social media has grown in the last several years, tech CEOs such as Facebook’s Mark Zuckerberg and Twitter’s Jack Dorsey have touted artificial intelligence as their solution for protecting users from bad actors and false information online.

While new applications for AI and machine learning will certainly be a part of the Center’s research, Carley pushes back on the notion that “fake news” is purely a technical problem.

“AI is an important component, but it is not going to solve everything,” Carley says. “Nor will AI, in the absence of good policy and good education, even address all the right issues.”

The Center won’t officially open until the fall, but Carley and the first cohort of grad students will be continuing research projects that have already started elsewhere on campus. Much study is still needed, but Carley says a few conclusions are already clear.

“We need new laws and new policies to protect our ability to interact without being unduly influenced by others,” says Carley. Given the world-spanning reach of many social networks, “we need to think about this not just from a national, but from a global perspective.”

Viewing our current internet ecosystem from the perspective of Asimov’s scientists, she says, “I think people might look back and say, ‘Boy, it was the Wild West all over again, but in digital form.’”

Though she’s optimistic that some change will come, Carley also says online deception is likely to remain a problem for some time.

“I do think we will see new policies and laws coming out that will regulate these things,” she says. “But I also think that the technology will be evolving so fast that it will stay kind of Wild West-ish for quite a while.”

Bill O'Toole was a full-time reporter for NEXTpittsburgh until October 2019. He previously reported in Myanmar.