As part of the majority of white male programmers out there, it can be hard to notice the sexism and racism inherent in the output of machine learning algorithms. Garbage in, garbage out: hundreds of years of inequality feed into the training datasets of machine learning algorithms as web pages, books, and essays are crawled. So it was not a big surprise when ProPublica reported on bias in the software that court systems were using to predict future criminality in sentencing decisions.

I just read an NPR article about sexism inherent in word association algorithms. It sounds like researchers are trying to define a threshold between associations that are legitimately gender-specific and ones that add bias and sexism into the word associations.

With more diversity in the workplace, maybe we will be able to find solutions to this bias. The challenge of a vicious cycle still remains, though: if algorithms push female and/or non-white resumes to the bottom of the pack, it may be hard to create that diversity in the first place.
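To make the "threshold" idea concrete, here is a minimal sketch of how gendered association in word embeddings can be measured. The vectors below are made up purely for illustration (real systems learn hundreds of dimensions from large text corpora); the technique of projecting words onto a "he minus she" direction is the general approach used in word-embedding bias research, not the specific algorithm from the NPR article.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Hypothetical 3-d embeddings; real embeddings are trained, not hand-written.
vectors = {
    "he":       [1.0, 0.1, 0.2],
    "she":      [-1.0, 0.1, 0.2],
    "engineer": [0.6, 0.8, 0.1],   # skewed toward "he" by biased training text
    "nurse":    [-0.6, 0.8, 0.1],  # skewed toward "she"
    "uncle":    [0.9, 0.2, 0.5],   # legitimately gender-specific
}

# A simple gender direction: the difference between "he" and "she".
gender = [a - b for a, b in zip(vectors["he"], vectors["she"])]

# Project each word onto that direction. A word like "uncle" should score
# high (it is gender-specific by definition); a word like "engineer"
# scoring high is a sign of learned bias. Drawing the line between the
# two cases is exactly the threshold problem described above.
for word in ("engineer", "nurse", "uncle"):
    print(f"{word}: {cosine(vectors[word], gender):+.2f}")
```

In a real pipeline the same projection is computed against trained embeddings, and debiasing work then tries to zero out the gender component for words that should be neutral while leaving definitionally gendered words alone.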