Thanks to Mark Guzdial for pointing this out to me via his great computing education blog. The post gave me pause -- we intend (or at least I intend) for technology and computing to provide more access, and thus more diversity, but might the tools "bake in" (or even exacerbate?) existing biases and inequities?
I think this is the most troubling question posed in the blog post: "What is the difference between the pattern recognition afforded by big data, and profiling on the basis of gender, race or class?"
My initial response concerns how we use this profiling/pattern-recognition information -- computer programs recognize patterns, people profile (whether they admit it or not). The choice of how we use this information is ours, and it seems we need to tread with caution.