Friday, February 13, 2015
Aaron Clauset of the University of Colorado, Boulder (and Haverford College class of 2001) just published an article suggesting evidence of bias in faculty hiring, especially in computer science and especially against women.
Sunday, January 18, 2015
So I saw this recent article about not teaching all women to code. Or, more accurately, the title is provocative. As I read the piece, there are some ideas I agree with, like the idea that we should never coerce students to learn coding, or anything else for that matter. But I do believe we should expose them to computational thinking, and that will include algorithms, data representations ... in other words, programming.
I am sensing that the issue lies more with targeted accommodations for women; studies indicate that is where the problem arises. Still, the dream should be less accommodation and more universal design of education that includes everyone. That way, women, like everyone else, can make a more informed choice about career and interest.
But I am considering a (sarcastic) response about not teaching men to cook, or some other provocative title. Stay tuned.
Tuesday, November 18, 2014
UPDATE: The SIGCSE/CSTA communities have rallied and developed some great responses to the original book, a few highlighted here:
- Feminist Hacker Barbie where you can rewrite the book
- An example remixed by Casey Fiesler
- An article at Cosmo
- The original post by Pamela Ribon
There is a new book about Barbie and computer engineering, and I think Mattel needs to get some feedback: it portrays a young girl who is only able to design a computer game and needs boys to implement it. Marie desJardins contacted the author directly and shared the reply with SIGCSE, copied below:
Thank you for your email. I am grateful that you have pointed this out to me. When I write Barbie stories, I always try to write them from a feminist perspective. The story of "Barbie Computer Engineer" was an assignment I got that had to be based on an existing Italian Barbie magazine story. My assignment was to rewrite the story for a book format. I never saw a final copy (I am just a lowly freelance writer, they don't send me copies). I will order a copy and see what exactly I wrote that is upsetting people. While I take responsibility for what I wrote, you should be aware that I was obliged to follow the existing story and I do not know how Mattel changed the story after I wrote it.
I welcome the twitter controversy and I should have perhaps seen this and pushed with the editors to make the story better in terms of the way it portrays woman. I think Mattel should be more responsible towards the young girls affected by their content and I should too. I consider myself a feminist and have worked for many feminist causes so I was surprised by your email. Sometimes as a freelance writer you get lazy and just follow orders and forget to think about the young people you are affecting. Thank you for reminding me.
all the best,
I suppose it is a sincere response, but the damage is still done -- on to contacting Mattel directly? Stay tuned!
Saturday, November 15, 2014
Saturday, September 6, 2014
Tuesday, August 19, 2014
Updated Sunday 24 August ...
Found this via a friend. Read through the comments -- there is still work to do :-(
Monday, June 9, 2014
Recent findings under review at Psychological Science "... suggest that a more balanced division of household labor among parents might promote greater workforce equality in future generations."
I am so OK with that; I like watching TV while folding/ironing and listening to the radio in the kitchen -- I just can't wait for my wife to find out she needs to mow the lawn and repair the car (oh wait, she's already done each of those tasks :).
Mom was right, actions do speak louder than words.
Thursday, April 10, 2014
I think this is the most troubling question posed in the blog post: "What is the difference between the pattern recognition afforded by big data, and profiling on the basis of gender, race or class?"
My initial response involves how we use this profiling/pattern-recognition information -- computer programs recognize, people profile (whether they admit it or not). Our choice is in how we use this information, and it seems we need to tread with caution.