People have been getting stories wrong, or outright lying, since they were old enough to play “whisper down the lane” or “telephone” at recess. Now that the story travels tweet by retweet and post by repost, rather than by word of mouth, we use the terms “misinformation” and “disinformation” to describe, respectively, the accidental or intentional hijacking of the message. A high-profile report on the phenomenon, recently released by the Aspen Institute, suggests that social media’s ability to accelerate the relay and reach of misleading and biased information to an increasingly isolated and credulous audience, compounded by longstanding social inequities, has created what researchers call an “information disorder.”
Denise Agosto, PhD, a professor in the College of Computing & Informatics, has studied information behaviors online for more than two decades. She suggests that to begin solving the problem, we all need to become more critical and discerning receivers and conduits of information, starting from an early age. Teaching students to be adept not only at understanding the information presented to them, but at assessing the credibility of its source, is a skill typically covered only in the context of research papers or library courses. But Agosto argues that information literacy skills – collecting, organizing, evaluating and communicating information – should be a fundamental part of the curriculum from the start. Agosto and her team have examined how schools currently teach these skills as they apply to gathering information on the Internet, and have created a number of resources to help teachers, librarians and parents guide children toward better information habits and a healthy relationship with technology during the pandemic.
Agosto recently took the time to review the Aspen study and provide some insight into how information literacy education might tackle the roots of information disorder in our society.
The report is careful to explain that our struggles with misinformation and disinformation do not take place in a vacuum – suggesting that, in some ways, our broader social problems (e.g., structural inequities) have primed people to fall victim to false and misleading stories. Is this something you have also observed in your research? Why is this context important for guiding any effort to tackle our “information disorder”?
Yes, it is crucial to understand that misinformation and disinformation are not, in and of themselves, the root causes of the problems we see in online discourse. Rather, they are manifestations of existing societal problems. These are usually long-standing social issues, such as social power imbalances, political intimidation, economic inequality and racism. Nor is the spread of misinformation through the media new. It can be traced in the mass media at least to the height of yellow journalism in the United States in the late 19th century, and probably much further back than that.
If these issues and the existence of misinformation are not new, then why are misinformation and disinformation of such urgent concern today? Part of the current upsurge stems from the highly polarized political climate in the United States and several other countries. This partisan conflict both fuels and is fueled by the nature of social media itself, which allows biased, misleading and inflammatory information to spread faster, and to reach many more people, than ever before. Social media thus multiplies the speed, reach and influence of misinformation and disinformation. And the damage it creates can be far worse than in previous media environments, when the news and information cycle was much, much slower and most often passed through professional news organizations with editors, fact-checkers and established journalistic standards.
So what can we do to combat the current, rapid spread of misinformation and disinformation online? The most effective approach will be a multi-stakeholder strategy. To date, in the United States, we have ceded the responsibility of policing online speech to the platforms themselves – the social media companies. This ad hoc regulation has proven uneven and ineffective, conferring disproportionate power on the companies that operate these platforms and undue influence on advertisers and other self-promoting entities. We need to explore effective government interventions that more equitably reflect the values of the communities they represent. Educational intervention is also crucial. Educators at all levels should teach their students not only to identify bias in media systems and resources, but also to question the existing power structures that give rise to the social problems and injustices they so often inflame. And of course, economic intervention is needed as well. All of these partial solutions require funding for successful development and implementation.
How has this reality guided your research and your approach to the problem?
Issues of power, privilege and equity are fundamental to my work, which draws on information literacy and library services to combat this crisis. I also draw on concepts from media literacy, teaching people to identify the biases encoded in media messages and to understand that all information carries a perspective.
If you could institute one regulation on platforms or tech companies right now to stop the spread of misinformation and disinformation, what would it be?
My first move to reduce the spread of misinformation and disinformation online would not be a technology regulation of my own devising. Effective technology regulation must be grounded in community needs and input from multiple stakeholders. Instead, I would take immediate action to reform social media education in schools.
I have worked with several high schools in the United States to study student reactions to school-mandated social media education. Almost invariably, schools – and the state governments that regulate their curricula – translate “social media education” into “Internet safety education.” Unfortunately, that usually means relying on scare tactics that show students only the negative sides of social media use while neglecting its educational, social and emotional benefits. In addition, the schools I have visited lean heavily on videos to deliver Internet safety lessons. Scare tactics and non-interactive video delivery both tend to discourage students from learning.
So my first solution would be to reframe Internet safety education in American schools: reduce the scare tactics and present a more balanced discussion of the risks and benefits of using the Internet and social media; teach students thoughtful decision-making about those risks and rewards; add more balanced media education content; and build content delivery around personal conversations with peers and with members of communities outside the schools, to increase student interest, learning outcomes, and community understanding and empathy.
What are the most effective messages or techniques you have found in your research on information literacy in the age of misinformation and disinformation, and in developing recommendations for teaching information literacy to children and teenagers?
The students I have worked with respond best to peer learning – active discussions with their peers about their shared and individual experiences online – and to hands-on learning in the lab, which allows instructors to show real examples of suspicious and harmful online conversations, as well as to share best practices for engaging in meaningful online discussion. Finally, building a sense of community and responsible belonging is the key to teaching good digital citizenship effectively.
Some of the key approaches we can take to teach effective information literacy to counter misinformation and disinformation include:
- Explain how information is created and shared in networked environments. Many people do not have a clear understanding of how information is created and shared online, which can lead them to place unwarranted trust in things they read or see online.
- Acknowledge the important role that cultural and social group belonging plays in information sharing. We tend to believe information that confirms our existing beliefs or fits our cultural frames of reference. This “confirmation bias” can undermine our critical thinking.
- Frame online discourse within conversations about power, privilege and equity, for example as they are:
- embedded in people’s everyday information practices
- embedded in the design of algorithms and information systems
- a motivation for spreading disinformation
- Teach people to consider who created a piece of information, and why it was created, before sharing it.
- Encourage people to calm down before sharing emotionally charged or political content. Emotionally charged content tends to be more compelling than facts or statistics, so it’s important to take a few minutes to calm down before you decide to believe or share information.
- Focus on empathy to reduce anger-fueled speech. Often, if we understand another person’s point of view or motivation, we can engage in more thoughtful and reasoned discourse.
For more information and resources on information and media literacy, Agosto recommends:
Journalists interested in speaking with Agosto should contact Britt Faulstick, Deputy Director of Media Relations, [email protected] or 215.895.2617.