
Sunshine as antiseptic – Group pays for coverage of opaque AI

Organizers of an AI research and advocacy group in Berlin say the world is still too passive about artificial intelligence. In a bid to change that trajectory, AlgorithmWatch has awarded fellowships for reporting on algorithmic accountability.

Individuals, groups, companies and the occasional national government are working to shed light on automated decision-making, but most of the underlying code is a black box and the efforts are diffuse. That opacity hinders public knowledge and trust, as well as efforts to regulate the algorithms.

AlgorithmWatch officials want to address part of that problem by naming six people (the original number was five) to six-month, €7,200 ($7,647) fellowships.

They want the program to produce more reporting on AI deployed in the European Union, covering the algorithms themselves and not just the topic in the abstract. As part of that reporting, the organizers want coverage of people whose lives are already affected by the code.

One of AlgorithmWatch’s organizing principles is that societies cannot leave their futures up to AI and the crafters of AI.

The six fellows are:

Naiara Bellio, a digital rights reporter who will investigate the indiscriminate use of AI by governments, including in Spain.

Pierluigi Bizzini, a journalist with a computer science background who has covered how automated systems affect the rights of migrants, the indigent and minorities.

Nathalie Koubayová, a PhD student fascinated with the social implications of chatbots. She plans to use her fellowship to examine the use of chatbots in mental health, ag tech and automated fact-checking.

Jennifer Krueckeberg, who recently completed a PhD in anthropology examining how digital media shapes young people’s personal memory practices. She plans to use her fellowship to look at how AI affects surveillance, education and daily life, potentially including biometric algorithms.

Kave Noori, a human rights lawyer who will examine the views of people with hearing disabilities on the ethics of robot interpreters.

Sonja Peteranderl, a journalist who has written for Der Spiegel and Wired Germany and will look into AI’s effect on the visibility of marginalized communities.

