
Facebook and other tech companies blame their lack of diversity on the “pipeline problem.” By pipeline, they mean that not enough women and people of color are taking STEM courses, which would then lead them to STEM internships, STEM degrees, and ultimately to becoming ideal candidates for STEM positions at their companies. Because so few underrepresented minorities enter the pipeline, the claim goes, there are not enough of them in the applicant pool when it comes time to hire.

This is a popular source of blame for Silicon Valley’s dismal diversity numbers. The U.S. tech industry is 8% Latino and 7.4% Black. Black women and Latinas comprise 1% of the Silicon Valley workforce. Of those who do break into the industry, people of color are 3.5 times more likely to leave tech than their white counterparts. At Facebook, the percentage of Black and Latino technical employees is 1% and 3% respectively, figures that have not changed since the company began reporting them in 2014. …



On August 20, 2018, the National Hispanic Media Coalition, along with 18MillionRising.org, Color of Change, and others, submitted public comments to the Federal Trade Commission (FTC) ahead of the public hearings on competition and consumer protection in the 21st century. These comments will aid the FTC in evaluating its law enforcement and policy agenda. Of the eleven topics on which organizations were asked to provide input, the filing addresses competition and consumer protection issues in communication, information, and media technology networks; the intersection between privacy, big data, and competition; and the agency’s investigation, enforcement, and remedial processes.

Click here to review the filing. Key points are as…



Artificial intelligence (AI) has become inescapable in our everyday lives: from our virtual personal assistants Siri and Alexa to AI-generated news articles to customer support. We have become so used to coexisting with AI that we often don’t think about the design of the algorithms or question the results. Only when things go terribly wrong, such as when a Google search for “Latina girls” returned pornographic results or facial recognition software ignored Black faces, do we start to see the cracks in the code.

AI is the product of people. When only people of the same background have a hand in the design, their shared implicit bias is built right into the code. Facial recognition software from IBM, Microsoft, and Face++ did not work with Black faces because the data set used to train the algorithms was overwhelmingly white and male. When the design team is also majority white and male, it isn’t likely that a homogeneous dataset will be perceived as an issue. According to the US Equal Employment Opportunity Commission, only 7.4% of the tech workforce is Black and 8% is Latino. Without diversity in tech and collaborative teams that bring different perspectives, we end up with AI that simply perpetuates the same racial bias that exists today. …
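To make the dataset problem concrete, here is a minimal, hypothetical sketch of the kind of audit that surfaces it: instead of reporting one aggregate accuracy number, a model’s results are scored separately for each demographic group. The groups, labels, and counts below are invented for illustration and are not figures from the commercial systems named above.

```python
# Hypothetical sketch: a disaggregated accuracy audit.
# Groups, labels, and counts are invented for illustration only.

from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted_label, true_label) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {group: correct[group] / total[group] for group in total}

# A toy benchmark: aggregate accuracy looks decent (80%), but splitting the
# results by group reveals that one group is served far worse than the other.
sample = (
    [("lighter-skinned men", "male", "male")] * 95
    + [("lighter-skinned men", "female", "male")] * 5
    + [("darker-skinned women", "female", "female")] * 65
    + [("darker-skinned women", "male", "female")] * 35
)

print(accuracy_by_group(sample))
# {'lighter-skinned men': 0.95, 'darker-skinned women': 0.65}
```

A single headline accuracy of 80% would hide exactly the kind of gap that audits of commercial facial-analysis tools exposed, which is why disaggregated evaluation matters.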



Facebook has had its share of scandals recently. Just within the past few weeks, Facebook was the subject of an undercover report about content moderation in Ireland, which led to calls for fines against the social network. Mark Zuckerberg, Facebook’s co-founder and CEO, defended the right of Holocaust deniers to post on the site, a statement that required almost immediate clarification. Leaked documents reveal that Zuckerberg actually congratulated the Trump campaign on its imaginative use of the platform. …


The Allied Media Conference (AMC) was held this year from June 14–17 at Wayne State University in Detroit, Michigan. The conference brings together technologists, podcasters, community activists and organizers, non-profits, and policy think tanks to discuss how advances in technology can improve our communities. Whether participants come from the West Coast, the East Coast, or somewhere in between, AMC makes the connection between grass-roots, boots-on-the-ground organizations and the national reach of the larger non-profits. This year AMC celebrated its 20th anniversary, marked by its highest attendance ever, with more than 3,000 attendees.


Media Democracy Fund and summer fellows at…



For the past four years, the Chicago Police Department has been working with researchers to build a system for judging which city residents are most likely to be involved in a shooting — either pulling the trigger, or getting shot. The resulting “heat list” — officially called the Strategic Subjects List (SSL) — has, for the most part, been shrouded in secrecy and speculation. What we’ve known is that everyone on the list gets a risk score, reflecting their predicted likelihood of being involved in a shooting.

The list is, to our knowledge, the highest-profile person-based predictive policing system in use across the United States. Perhaps that’s why it has attracted significant press attention — often including overstated comparisons to Minority Report — even though little is known about how it works. Most predictive policing systems fielded by major U.S. police departments today are “place-based,” meaning they attempt to forecast when and where future crime may occur. …
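For readers unfamiliar with how a person-based score differs from a place-based forecast, the sketch below shows the general shape of such a risk score: a handful of features about an individual combined into a single probability-like number. Every feature, weight, and value here is invented; the SSL’s actual inputs and weights have not been public, and this is not a description of that model.

```python
# Hypothetical illustration of a person-based risk score.
# All features, weights, and values are invented; this does not describe
# the Strategic Subjects List, whose model has not been made public.

import math

EXAMPLE_WEIGHTS = {
    "prior_arrests": 0.4,          # count of prior arrests
    "prior_shooting_victim": 1.2,  # 1 if previously shot, else 0
    "age_under_25": 0.8,           # 1 if under 25, else 0
}
INTERCEPT = -3.0

def risk_score(features, weights=EXAMPLE_WEIGHTS, intercept=INTERCEPT):
    """Logistic-regression-style score: weighted sum squashed into (0, 1)."""
    z = intercept + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

person = {"prior_arrests": 2, "prior_shooting_victim": 1, "age_under_25": 1}
print(round(risk_score(person), 2))  # 0.45 with these made-up numbers
```

The point of the sketch is simply that a person-based system attaches a number to an individual, whereas a place-based system attaches one to a time and location, which is why the secrecy around which features feed the SSL matters so much.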

About

Brianna Posadas

Media Democracy Fund PhDx Fellow at National Hispanic Media Coalition
