Program

The student retreat will take place in person from 10 to 13 November 2021. Arrival is on Wednesday and departure on Saturday, leaving two full days for working together. For better group dynamics, we ask you to attend all four days and stay three nights.

Our main goals are to share knowledge, set the tracks for new research avenues, and raise awareness of development practices that produce bias.

Organisation

Three invited experts will present their work, focusing on key concepts, methods and theories used to deconstruct and analyse practical cases during the retreat. Each presentation is supported by references that you need to read in advance.

After the presentations, we will favor a collective learning process based on a flipped classroom approach, in which participants take the lead in raising issues, concerns, reflections and questions on the topic.

Later, we will form 3 groups, each with 3 to 4 members and 1 expert, working on a specific question. Each group decides together which question to work on. After the group work, a collective discussion will be held to share and confront the outcomes and provide mutual feedback. A daily summary will record the findings, gaps, and personal and collective reflections, in preparation for the final research agenda.

We have also planned free time for informal exchanges, talking and finding common interests, eating tasty local food, and an outdoor hike in the Jura. Resting is also part of the program!

Experts who will be working with you

Prof. Jenna Burrell / UC Berkeley

Bio: “I am the co-director of the Algorithmic Fairness and Opacity Working Group. My first book Invisible Users: Youth in the Internet Cafes of Urban Ghana (The MIT Press) came out in May 2012. I am currently working on a second book about rural communities that host critical Internet infrastructure such as fiber optic cables and data centers. My research focuses on how marginalized communities adapt digital technologies to meet their needs and to pursue their goals and ideals.” https://www.ischool.berkeley.edu/people/jenna-burrell

Dr. Margarita Boenig-Liptsin / Paris Institute for Advanced Study (IAS)

Bio: Margarita (Margo) Boenig-Liptsin is a Research Associate in the Division of Computing, Data Science and Society at the University of California, Berkeley. Prior to the Fellowship at the Paris IAS, she was the Director of the Human Contexts and Ethics Program in the Division and Lecturer in the History Department. She teaches about the relationship between technology, power, democracy, and ethics with a foundation in Science, Technology, and Society (STS) to students in both technical and social science/humanities fields. At Berkeley, she leads a team that develops the Human Contexts and Ethics (HCE) component of the data science educational program. This work includes translating social science theory and methods for students of all backgrounds, building connections between engineering and social sciences, bringing together faculty, graduate students, and undergraduates at Berkeley into a community of scholars focused on issues of technology and society, and connecting the work of HCE to the San Francisco Bay Area’s communities and needs. Margarita Boenig-Liptsin is trained in the field of Science, Technology and Society and holds a PhD in History of Science (Harvard University) and in Philosophy (Université Paris-Sorbonne). Her research examines transformations to human identity and citizenship in relation to information technologies across time and cultures. https://www.paris-iea.fr/en/fellows/margarita-boenig-liptsin-2

Dr. Isabel Ebert / University of St. Gallen (HSG)

Dr. Ebert is a Senior Research Fellow at the Institute for Business Ethics, University of St. Gallen (HSG). Her main research areas are Business & Human Rights; Digital Rights; Workplace Monitoring & Privacy; Future of Work; Fairness, Accountability & Transparency in Artificial Intelligence, Machine Learning and Data Models; Commodity Markets; International Governance & Business Ethics; and Politics, in particular Peace & Conflict Research and Private Sector Governance. https://iwe.unisg.ch/en/personenverzeichnis/aeb67bcd-9090-4881-b9bc-ff004804d389

Dr. Kebene Kejela Wodajo / University of St. Gallen (HSG)

Dr. Kebene Wodajo is a post-doctoral researcher at the Institute for Business Ethics, University of St. Gallen. Prior to taking up her current position, Wodajo was a lecturer and visiting lecturer at Ambo University and the Addis Ababa University School of Law. Her current research project is entitled “Regulating Structural Injustice in the Digital Space” and explores responsibility for structural and intersectional injustices that are channeled through digital technologies: https://iwe.unisg.ch/en/initiativen/competence-center-for-african-research/members-and-projects/kebene-wodajo. Her research fields are Business & Human Rights/Business Ethics, technology and Africa, and the economic analysis of law.

Schedule

| Time | Wed 10 Nov 2021 | Thur 11 Nov 2021 | Fri 12 Nov 2021 | Sat 13 Nov 2021 |
|---|---|---|---|---|
| 9:00 | | Welcome and Breakfast | Breakfast | |
| 10:00 | Prof. Burrell’s public talk at the EPFL | Introduction | Dr. Ebert’s and Dr. Wodajo’s Presentation | Breakfast |
| 10:15 | Prof. Burrell’s public talk at the EPFL (cont.) | Collective Presentation | | |
| 10:45 | Prof. Burrell’s public talk at the EPFL (cont.) | Prof. Burrell’s Presentation | Groups’ Organization & Topic Definition | Wrap-up / Conclusion |
| 11:45 | Discussion with Prof. Burrell at the EPFL | Dr. Boenig-Liptsin’s Presentation | Practical work in groups with Prof. Burrell / Dr. Ebert / Dr. Boenig-Liptsin / Dr. Wodajo | Departure |
| 12:15 | Discussion and aperitif at the EPFL | Collective Discussion | Lunch | |
| 13:00 | | Lunch | Outdoor hiking | |
| 14:00 | | Flipped Classroom Dynamic: Discussions with Prof. Burrell / Dr. Boenig-Liptsin / Participants | Practical work in groups with Prof. Burrell / Dr. Ebert / Dr. Boenig-Liptsin / Dr. Wodajo | |
| 16:30 | Arrival | Farm visit | Groups’ Result Presentation | |
| 17:00 | | | Collective Feedback | |
| 19:00 | Dinner | Dinner | Dinner & Music | |

Group objectives

  • State of the art
  • Defining types of biases (human, algorithmic, gender, race…) and what is not a bias
  • Identifying use cases, theoretical concepts and methods for analysing algorithms
  • Critical thinking
  • Reflecting on developing practices and processes

Outputs

The final research agenda will focus on algorithmic bias identified through the different stages of a project’s development and on practical cases (platforms mapped and analysed). The document will be available on a collaborative wiki page and on the retreat’s website.
The written reports on the presentations and discussions will also be published on the wiki page, so that every participant has the possibility to extend the information.

References

Some of the readings below are mandatory. Please read them before the retreat so that we have a common background. If you do not have access to the articles, we will send them to you by email after your registration.

Mandatory and Optional Readings

Prof. Burrell’s references: Concepts: autonomy, control, fairness, domination

MANDATORY: Burrell, Jenna, Zoe Kahn, Anne Jonas, and Daniel Griffin. “When Users Control the Algorithms: Values Expressed in Practices on Twitter.” Proceedings of the ACM on Human-Computer Interaction 3, no. CSCW (November 7, 2019): 1–20. https://doi.org/10.1145/3359240.

MANDATORY: Petre, Caitlin, Brooke Erin Duffy, and Emily Hund. “‘Gaming the System’: Platform Paternalism and the Politics of Algorithmic Visibility.” Social Media + Society 5, no. 4 (October 2019): 205630511987999. https://doi.org/10.1177/2056305119879995.

MANDATORY: Young, Iris Marion. “Taking the Basic Structure Seriously.” Perspectives on Politics 4, no. 01 (March 2006). https://doi.org/10.1017/S1537592706060099.

OPTIONAL: Burrell, Jenna, and Marion Fourcade. “The Society of Algorithms.” Annual Review of Sociology 47, no. 1 (July 31, 2021): 213–37. https://doi.org/10.1146/annurev-soc-090820-020800.

OPTIONAL: Burrell, Jenna. “How the Machine ‘Thinks’: Understanding Opacity in Machine Learning Algorithms.” Big Data & Society 3, no. 1 (January 5, 2016): 1–12. https://doi.org/10.1177/2053951715622512.

Dr Boenig-Liptsin’s references: Concepts: relational ethics, co-production, sociotechnical imaginaries

MANDATORY: Boenig-Liptsin, Margarita. “Aiming at the Good Life in the Datafied World: A Co-Productionist Framework for Ethics.”

MANDATORY: Williams, Patricia. Foundations of Responsible Computer Science (FORC) Keynote, June 3, 2020. https://www.youtube.com/watch?v=5DXRS_eHs6A&t=828s

MANDATORY: Jasanoff, Sheila. “Future Imperfect: Science, Technology and the Imagination of Modernity,” in S. Jasanoff and S. Kim (Eds.), Dreamscapes of Modernity: Sociotechnical Imaginaries and the Fabrication of Power, Chicago, IL: University of Chicago Press, 2015.

Dr. Ebert’s and Dr. Wodajo’s references: Method: mixed methods, qualitative doctrinal analysis and selected case discussions. The conceptual focus will be a human rights perspective on the digital ecosystem and structural injustice in cyberspace (with a perspective from Global South epistemology).

MANDATORY: Birhane, Abeba. “Algorithmic Injustice: A Relational Ethics Approach.” Patterns 2, no. 2 (February 12, 2021): 100205. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7892355/

Mhlambi, Sabelo, From Rationality to Relationality: Ubuntu as an Ethical and Human Rights Framework for Artificial Intelligence Governance, Carr Center Discussion Paper 2020-009, https://carrcenter.hks.harvard.edu/publications/rationality-relationality-ubuntu-ethical-and-human-rights-framework-artificial

Wodajo, Kebene and Ebert, Isabel, Reimagining Corporate Responsibility for Structural (In)justice in the Digital Ecosystem: A Perspective from African Ethics of Duty, Oct. 15, 2021, https://www.afronomicslaw.org/category/analysis/reimagining-corporate-responsibility-structural-injustice-digital-ecosystem

MANDATORY: Arora P. Decolonizing Privacy Studies. Television & New Media. 2019;20(4):366-378 https://journals.sagepub.com/doi/abs/10.1177/1527476418806092

Ebert, Isabel Laura; Wildhaber, Isabelle & Adams-Prassl, Jeremias (2021) Big Data in the workplace: Privacy Due Diligence as a human rights-based approach to employee privacy protection. Big Data & Society, https://www.alexandria.unisg.ch/263173/2/Big%20Data%20Society%20EBERT%20et%20al%202021%20Privacy%20Due%20Diligence.pdf

Ebert, Isabel Laura Ethical Managerial Practice in Dealing with Government Data Requests. Zeitschrift für Wirtschafts- und Unternehmensethik, Jahrgang 20 (2019), Heft 2 264-275. ISSN 1439-880X https://www.alexandria.unisg.ch/258672/1/Tech%20company%20dilemma%20ZfWU%20final.pdf

Schafheitle, Simon Daniel; Weibel, Antoinette; Ebert, Isabel Laura; Kasper, Gabriel; Schank, Christoph & Leicht-Deobald, Ulrich (2020) No stone left unturned? Towards a framework for the impact of datafication technologies on organizational control. Academy of Management Discoveries, 6 (3). 455-487 https://www.alexandria.unisg.ch/260238/1/Schafheitle_etal_TechnologyControl.pdf

Additional material

Basics:
Gran, Anne-Britt, Peter Booth, and Taina Bucher. “To Be or Not to Be Algorithm Aware: A Question of a New Digital Divide?” Information, Communication & Society, March 9, 2020, 1–18. https://doi.org/10.1080/1369118X.2020.1736124.

Lewis, Kevin. “Three Fallacies of Digital Footprints.” Big Data & Society 2, no. 2 (December 27, 2015): 205395171560249. https://doi.org/10.1177/2053951715602496.

Pasquinelli, Matteo. “HOW A MACHINE LEARNS AND FAILS – A GRAMMAR OF ERROR FOR ARTIFICIAL INTELLIGENCE.” Spheres 5 (2019): 17.

Wieringa, Maranke. “What to Account for When Accounting for Algorithms: A Systematic Literature Review on Algorithmic Accountability.” In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency, 1–18. Barcelona, Spain: Association for Computing Machinery, 2020. https://doi.org/10.1145/3351095.3372833.

Qualitative and mixed-methods methodologies:
Bivens, Rena, and Anna Shah Hoque. “Programming Sex, Gender, and Sexuality: Infrastructural Failures in ‘Feminist’ Dating App Bumble.” Canadian Journal of Communication 43, no. 3 (August 13, 2018): 441–59. https://doi.org/10.22230/cjc.2019v44n3a3375.

MacLeod, Caitlin, and Victoria McArthur. “The Construction of Gender in Dating Apps: An Interface Analysis of Tinder and Bumble.” Feminist Media Studies 19, no. 6 (August 18, 2019): 822–40. https://doi.org/10.1080/14680777.2018.1494618.

Olgado, Benedict S., Lucy Pei, and Roderic Crooks. “Determining the Extractive Casting Mold of Intimate Platforms through Document Theory.” In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–10. Honolulu, HI, USA: Association for Computing Machinery, 2020. https://doi.org/10.1145/3313831.3376850.

Pidoux, Jessica, Pascale Kuntz, and Daniel Gatica-Perez. “Declarative Variables in Online Dating: A Mixed-Method Analysis of a Mimetic-Distinctive Mechanism.” Proceedings of the ACM on Human-Computer Interaction 5, no. CSCW1 (April 2021): 100–132. https://doi.org/10.1145/3449174.

Ziewitz, Malte. “A Not Quite Random Walk: Experimenting with the Ethnomethods of the Algorithm.” Big Data & Society 4, no. 2 (December 2017): 1–13. https://doi.org/10.1177/2053951717738105.

Dating app algorithms:
Bapna, Ravi, Jui Ramaprasad, Galit Shmueli, and Akhmed Umyarov. “One-Way Mirrors in Online Dating: A Randomized Field Experiment.” Management Science 62, no. 11 (February 2, 2016): 3100–3122. https://doi.org/10.1287/mnsc.2015.2301.

Hutson, Jevan A., Jessie G. Taft, Solon Barocas, and Karen Levy. “Debiasing Desire: Addressing Bias & Discrimination on Intimate Platforms.” Proceedings of the ACM on Human-Computer Interaction 2, no. CSCW (November 1, 2018): 1–18. https://doi.org/10.1145/3274342.

Sharabi, Liesel L. “Exploring How Beliefs About Algorithms Shape (Offline) Success in Online Dating: A Two-Wave Longitudinal Investigation.” Communication Research, January 20, 2020. https://doi.org/10.1177/0093650219896936.

Wang, Shuaishuai. “Calculating Dating Goals: Data Gaming and Algorithmic Sociality on Blued, a Chinese Gay Dating App.” Information, Communication & Society 23, no. 2 (January 28, 2020): 181–97. https://doi.org/10.1080/1369118X.2018.1490796.