
Route Home

There are many apps that claim to navigate users through cities while prioritizing “safety.” Some use crowdsourced reports of “dangerous” or “sketchy” spots to help walkers avoid unsafe paths; more often they use crime data collected by municipalities. Relying on subjective metrics like “sketchiness” can produce racist or classist data sets, and crime data taken uncritically has the same biases baked in at an institutional level. At the same time, these apps make implicit assumptions about what makes a person feel safe. Fewer drug busts, more police officers, white neighborhoods with well-lit streets and expensive houses: this set of parameters does not represent safety for all Americans, and for some could represent added danger. Still, most safe-navigation apps route people out of poorer neighborhoods into richer ones, and out of black and brown neighborhoods into whiter ones.

Route Home was a speculative piece that attempted to examine how these biases live in our data sets and algorithms by providing more choice, freedom, and visibility in building a personal definition of safe. It was a class project built by Babson, Olin, and Wellesley students for a class in software design. Users could filter for specific types of incidents pulled from Boston crime and traffic data, like assault, robbery, or vehicle collisions, and see how these affected their proposed walking routes.
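
A minimal sketch of how this kind of incident filtering and route scoring might work. The record schema, function names, and the distance-based penalty are illustrative assumptions, not the project's actual implementation:

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical incident record; these field names are assumptions,
# not the actual schema of Boston's crime/traffic datasets.
@dataclass
class Incident:
    kind: str   # e.g. "assault", "robbery", "vehicle_collision"
    x: float    # projected coordinates (meters)
    y: float

def filter_incidents(incidents, selected_kinds):
    """Keep only the incident types the user has chosen to weigh."""
    return [i for i in incidents if i.kind in selected_kinds]

def edge_penalty(x1, y1, x2, y2, incidents, radius=50.0):
    """Count selected incidents within `radius` meters of a route
    segment's midpoint; a router could add this to the segment's cost."""
    mx, my = (x1 + x2) / 2, (y1 + y2) / 2
    return sum(1 for i in incidents if hypot(i.x - mx, i.y - my) <= radius)

# Usage: a user who only wants vehicle collisions to affect their route.
data = [Incident("assault", 10, 10), Incident("vehicle_collision", 12, 9)]
chosen = filter_incidents(data, {"vehicle_collision"})
print(edge_penalty(0, 0, 20, 20, chosen))  # -> 1
```

The key point of the design, under these assumptions, is that the user picks which incident types count at all, rather than the app imposing a single opaque definition of “unsafe.”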

Future versions of Route Home might incorporate lighting data, crowdsourced or real-time police spotting and traffic incidents, visualization of demographic information, and more. We wonder whether adding these would allow for a more critical eye toward bias in our data, algorithms, law enforcement, and language, or just enable more overt and informed racism.


https://github.com/christinagee/Safe-Route-Home
