The information landscape has changed dramatically with the expansion of the internet and online social networks. Optimistic views held that online communication would foster a culture of participation. Recent events, however, suggest that social media platforms limit the diversity of content by exposing users mainly to their pre-existing beliefs, a phenomenon known as echo chambers. In addition, users with malicious intent exploit these platforms to deceive people and discredit the democratic process. To better understand these two phenomena, this chapter describes a computational method for analyzing coordinated inauthentic behavior in Facebook groups, focusing on posts, URLs, and images. Our findings show that identical items are shared almost simultaneously by different entities across Facebook groups. This allowed us to identify groups that resemble disinformation echo chambers, in which disinformation narratives are shared repeatedly. The chapter concludes with theoretical and empirical implications.
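For illustration, the core of such a co-sharing analysis can be approximated as in the sketch below. This is a minimal sketch under stated assumptions, not the chapter's exact implementation: the column names (item_id, group_id, timestamp), the pandas-based pairwise comparison, and the 60-second coordination window are all assumptions made for the example.

# Minimal sketch of near-simultaneous co-sharing detection (illustrative only;
# not the chapter's exact method). Column names and the 60-second window are
# assumptions for this example.
import pandas as pd

def find_coordinated_shares(shares, window_seconds=60):
    # `shares` is expected to have columns: item_id (URL, post text, or image
    # hash), group_id (the sharing Facebook group), and timestamp (datetime).
    pairs = shares.merge(shares, on="item_id", suffixes=("_a", "_b"))
    # Keep each pair of distinct groups once.
    pairs = pairs[pairs["group_id_a"] < pairs["group_id_b"]]
    # Flag shares of the same item posted within the coordination window.
    delta = (pairs["timestamp_a"] - pairs["timestamp_b"]).abs()
    return pairs[delta <= pd.Timedelta(seconds=window_seconds)]

# Toy example: two groups share the same URL 30 seconds apart.
shares = pd.DataFrame({
    "item_id": ["url_1", "url_1", "url_2"],
    "group_id": ["group_A", "group_B", "group_A"],
    "timestamp": pd.to_datetime([
        "2020-01-01 12:00:00",
        "2020-01-01 12:00:30",
        "2020-01-01 13:00:00",
    ]),
})
print(find_coordinated_shares(shares))

Note that the pairwise self-join is quadratic in the number of shares per item; a production script may well use a more scalable approach.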
If you use this code in your research, please cite it using the following DOI:
Use the provided Python script "cib.py" to analyze coordinated inauthentic behavior on Facebook. Ensure the necessary dependencies are installed, then run the script as follows:
python3 cib.py
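The script's exact dependencies are listed in the repository; the sketch above only assumes pandas, which, if missing, can be installed with pip:

pip install pandas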
This project is licensed under the GPL. See the LICENSE file for more information.
This project was partially funded by the University of Amsterdam’s RPA Human(e) AI and by the European Union’s Horizon 2020 research and innovation program under grant agreement No 951911 (AI4Media).
For any inquiries or feedback, please contact the project maintainer: Wilson Ceron.