Dietram Scheufele can picture a “nightmare scenario” for next year’s elections.
Days before Election Day, a deepfake surfaces: a piece of AI-generated disinformation crafted convincingly enough to sway voters and change the outcome of the election. Only a week later does it come out that none of it was true.
“There’s no instant replay, there’s no rules like there are for the Super Bowl,” the University of Wisconsin-Madison professor told a packed house at Tripp Commons in Memorial Union Monday night. “It’s just a mistake and we all have to accept it.”
Scheufele, director of graduate studies in the Department of Life Sciences Communication, joined a Cap Times Idea Fest panel alongside Kaiping Chen, an assistant professor of computational communication, and Kathleen Bartzen Culver, who holds the Burgess Chair in journalism ethics, both also of UW-Madison. Moderated by Cap Times state government and disinformation reporter Erin McGroarty, the three panelists discussed “Journalism in the Age of AI: Who Will Tell Us What’s True?”
Culver noted that AI is already “ubiquitous in our lives.”
She cited a recent example: she asked Spotify, which uses generative AI, to create a radio station of music similar to Cat Stevens songs. It took almost 90 minutes before the station played its first song by a female artist, a bias she attributed to the algorithm.
“Yeah. I mean, come on, think about the women of that era I should have been hearing,” Culver said. “It generates the list based on what happened in the past. And what happened in the past? A lot of sexism and misogyny in the music industry.”
Culver suggested that artificial intelligence “has built-in biases,” meaning it can present something as current fact based on what happened in the past, even if it is no longer true.
“We have this kind of naive idea that technology is inherently objective and it’s not,” she said. “Our technology is built on the biases we have built in, and that should be a major concern in journalism.”
Once these biases are built into algorithms and begin to compound on their own, “they’re harmful and very difficult to get out of the system even if you want to fix them,” Scheufele said. In some cases, though, he suggested companies have an incentive to fix them: racial bias in facial recognition, for example, makes the technology less useful to the police departments that purchase it.
“I think there’s a commercial interest in fixing it. That’s why I’m optimistic that it will be fixed, at least in some areas,” he said.
Culver said that given the way people get their news, these biases “make human journalism even more important.” She acknowledged, however, the risk that trust in journalism could be undermined at an already difficult time for the industry.
One use Scheufele could see for journalists: employing the technology to sort through the very large databases and sets of emails obtained through open records requests and “extract patterns that serve as the basis for solid journalism.”
Chen said one of her big concerns is who gets to shape AI and the rules around its use. At the moment, those discussions happen primarily between politicians and the tech companies themselves.
Instead, she suggested there needs to be some way to “bring the public back into this conversation.”
“The people are missing from this conversation,” Chen said.
To help citizens combat the misinformation and disinformation around them, Culver said, journalism organizations should “put aside the competitive nature of journalism” and work together to share what they know with readers and viewers.
She also stressed that politicians, too, bear responsibility for stamping out misinformation and disinformation, especially in election campaigns.
“Ethics is not just for journalists,” she said. “These candidates have an ethical obligation to the truth, not an ethical obligation to win.”
“They have a duty of care to the public,” she added later. “If we brush that aside and say, ‘It’s just dirty politics,’ we should be ashamed.”
There have been some proposals for government regulation, but the potential approaches come with challenges of their own. Chen’s preference: better education that teaches people how to listen, think critically and act ethically.
“How should citizens be responsible to themselves and each other for the information they share?” Chen said. “It’s more positive and more achievable.”
Asked what news consumers can do to scrutinize information wisely, Culver suggested considering how a piece of information makes you feel and seeking out content that “actually makes you feel empowered as a citizen.” People are strangely willing to believe misinformation when it fits their existing worldview, she noted.
Chen had the simplest suggestion: “Sometimes you just have to not be online all the time.”