· 2017
We go through life, some of us with strife, many times until the better end. We never know how far we'll go, so don't stop until you win! In life we encounter so many different obstacles. They often seem too heavy a burden to carry, only for us to learn later a lesson that is far more valuable than we'd ever imagined. Is life hard? Are we complicating things? In Profound Thoughts, author Rashida Richardson recommends ways to get through some of life's toughest moments. Remaining positive during what seems to be the worst situation ever is the best gift you can give yourself. Ms. Rashida has created this bundle of positivity for you to use in moments of distress.
· 2019
Transparency and accountability are both tools to promote fair algorithmic decisions: they provide the foundations for obtaining recourse to meaningful explanation and correction, and for ascertaining faults that could bring about compensatory processes. The study develops policy options for the governance of algorithmic transparency and accountability, based on an analysis of the social, technical, and regulatory challenges posed by algorithmic systems. Drawing on an extensive review and analysis of existing proposals for the governance of algorithmic systems, the authors propose a set of four policy options, each of which addresses a different aspect of algorithmic transparency and accountability:
1. Awareness raising: education, watchdogs, and whistleblowers.
2. Accountability in public-sector use of algorithmic decision-making.
3. Regulatory oversight and legal liability.
4. Global coordination for algorithmic governance.
· 2021
This Essay asserts that, in the United States, racial segregation has played and continues to play a central evolutionary role in the inequalities we see reproduced and amplified by data-driven technologies and applications. Racial segregation distorts and constrains how data-driven technologies are developed and implemented, how problems of algorithmic bias are conceptualized, and which interventions or solutions are deemed appropriate and pursued. After detailing the foundational aspects of how racial segregation has evolved over time and its less obvious social, political, and epistemic implications for White Americans, the demographic group that dominates the technology sector, this Essay explores how racial segregation affects algorithmic design, analysis, and outcomes. It concludes with an analysis of why prevailing approaches to evaluating and mitigating algorithmic bias are insufficient and why a transformative justice framework is necessary to adequately examine and redress algorithmic bias, as well as to improve the development of data-driven technologies and applications. This Essay illustrates how critical analysis of racial segregation can deepen our understanding of algorithmic bias, improve evaluations of data-driven technologies for social and racial equity concerns, and broaden our imagination of what meaningful redress of technology-mediated harms and injustices should include.
· 2018
As the pervasiveness, complexity, and scale of AI systems grow, the lack of meaningful accountability and oversight – including basic safeguards of responsibility, liability, and due process – is an increasingly urgent concern. Building on our 2016 and 2017 reports, this report contends with this central problem and addresses the following key issues:
1. The growing accountability gap in AI, which favors those who create and deploy these technologies at the expense of those most affected.
2. The use of AI to maximize and amplify surveillance, especially in conjunction with facial and affect recognition, increasing the potential for centralized control and oppression.
3. Increasing government use of automated decision systems that directly impact individuals and communities without established accountability structures.
4. Unregulated and unmonitored forms of AI experimentation on human populations.
5. The limits of technological solutions to problems of fairness, bias, and discrimination.