Friday 30 December 2022

How Product Design Was Transformed By 2020

https://www.youtube.com/embed/w6q9obdfe3c


It may be hard to believe that products based on racial stereotypes survived well into the 21st century. There were always these friends I had breakfast with, and they would always take out their Aunt Jemima jug. I haven't really heard of any other syrups, so I usually just gravitate toward that one. It just seems like a super exploitation of a stereotype. What do we want? Justice! And when do we want it? Now! It took widespread protests in 2020 to send some companies a wake-up call.


But even as some of this controversial branding slowly disappears from store shelves, there could be an even more insidious and harder-to-fix form of racial bias coded into the technologies that have become ubiquitous in everyday life. Start your day with natural freshness from Darlie. One of the most popular toothpaste brands in East and Southeast Asia is called Darlie. Darlie is owned by Colgate-Palmolive and a local partner, and has a long history that goes back to, I think, the 1930s. And it originally started out as Darkie toothpaste.


The packaging was pretty explicit. It had a picture of a minstrel singer in blackface on the package, with a big smile and white teeth. This came about at a time when movies like "The Jazz Singer" with Al Jolson were very popular. ♪ Smile, Darkie, smile for me ♪ Darkie Toothpaste gives you a cool, fresh, tingling taste. You see how this is working in terms of marketing. There's nothing wrong with wanting white teeth, but when you contrast that with this jet-black person, the message is very clear: if you don't want to have dark teeth, if you don't want anything dark, use Darkie. It will solve that problem. They changed the name in English to Darlie, so one letter gets rid of the problem. They also changed the packaging, so it was no longer just a picture of a man in blackface. On the box, it was somebody in a top hat, and there was a black and white contrast, still getting the point across, black versus white, but it wasn't blackface.


That said, they didn't change the name in Chinese. The issue of having these kinds of brands with racist legacies is a bit more complicated here in Asia than it is in other parts of the world, in part just because of the different racial dynamics in Asia. You don't have a lot of Black people, and so people just don't think about these issues the same way. So there's less pressure domestically to do anything about them. Meanwhile, across the world, the United States has its own version of the Darlie story, in a product familiar to many people on their breakfast table.


There were two Black families in the town I grew up in, and Aunt Jemima was literally the only Black person I knew for most of my life. We used to use it a lot when we were kids, making pancakes in the morning and using the syrup too. I had good memories from my childhood, using it all the time. Well, I don't necessarily think Aunt Jemima was racist. I felt like it was just a brand that happened to be named that, and nothing more than that, really. At the time of Aunt Jemima's inception, in 1889, this was something that was marketed to White folks. Back then, you have this very recent memory of slavery, of what it meant to people and what subservience meant, in terms of not only a proposition but a role in the American way. The gentleman who came up with the idea for the Aunt Jemima motif and brand, Chris Rutt, attended a minstrel show in Missouri in the 1880s, where he saw a White man in blackface performing a minstrel role as a mammy, meant to target and make fun of African Americans, and African American women in particular. And it's out of that performance that Rutt got together with his collaborator, a guy by the name of Underwood, to come up with the marketing strategy for the Pearl Milling Company's brand of self-rising flour.


In the 1920s, Quaker Oats bought Pearl Milling Company, increasing the market share and cultural presence of the brand. Around the same time, mob violence and new forms of repression against African Americans were becoming more widespread. In the small Missouri town where the milling company behind Aunt Jemima was based, a 19-year-old Black man was lynched in 1933 after he was accused of making a pass at a White woman. That's the immediate context, because if you think about it, in comparison to that sort of brutal lynching of a young African American man, Aunt Jemima is something very different.


It's a contrast to that sort of threat of Black masculinity. Aunt Jemima is the embodiment of the safe, loyal, trusting house servant of slavery times. So that imagery of Aunt Jemima was both reassuring and nostalgic to White Americans, and particularly White Southerners. Nostalgia is an important emotion, and people buy products on emotion. For generations, frosty mornings have seemed warmer with stacks of... Aunt Jemima Buckwheats. Quaker tweaked the brand image over time, replacing the kerchief on the Aunt Jemima character's head with a plaid hairband in 1968, and again in 1989 by adding pearl earrings, a lace collar, and a then-fashionable perm hairstyle. It's got more maple in it. In the early '90s, a commercial campaign starring performer Gladys Knight tried to further normalize Aunt Jemima by showing how much African Americans themselves loved the product.


But even after the brand was acquired by PepsiCo in 2001, some still felt that the minor branding shifts over the years were mostly a convenient corporate distraction from a lingering racial bias. Aunt Jemima was a very strong brand, strong enough that it was never discontinued. And when we look at these products, the ethos of them, just being easy, ready-made products, that was the whole design.


To bring in these caricatures to convey this ease, as if you have this person, "Mammy," in your kitchen, cooking the pancakes for you. It's that easy, as if someone else did it for you. So racism is commerce, and unfortunately a lot of this marketing hinged on exactly that. Aunt Jemima buttermilk pancakes. Perfect pancakes in 10 shakes. This is a classic example of a corporation commercializing and capitalizing on racial stereotypes, and making in the process billions and billions of dollars in profit. Aunt Jemima is history. Quaker Oats announcing today that it is retiring both the logo and the name of the 130-year-old brand. Quaker, the latest company to take action as demonstrations protesting racial inequality continue across the country.


PepsiCo has recognized that we are in one of those moments, and so as not to lose market share, and to seem as though it's continuing to be relevant, it needs to make this attempt. And so they've done that with the Pearl Milling Company branding. Now, the Pearl Milling Company branding harkens back to the very beginning of this in St. Joseph, Missouri, to that era in relation to racial violence. I don't know that it goes far enough to separate it from the racist and oppressive past that so many African American people were exposed to. While Aunt Jemima was probably the most notorious mainstream brand to face a reckoning after the 2020 protests, a few others, such as Land O'Lakes and Uncle Ben's, have also come under scrutiny in the wake of renewed conversations around systemic racism. As corporations continue to navigate how to address racial stereotypes in advertising while still protecting profits, there's a whole other arena now where the problem of embedded racism may be even harder to detect and harder to stamp out.




I think my blackness is interfering with the computer's ability to follow me. As you can see, I do this, no following. Not really, not really following me. I back up, I get really, really close to try to let the camera recognize me, not happening. Now, my White coworker, Wanda, is about to slide in the frame, you'll immediately see what I'm talking about. Wanda, if you would, please. Sure. It stems from the roots, and AI is just the next branch.


Everything is exactly the same when it comes to the application, the technology behind it, because when you look at who's in the room, it's not Black people who are testing these AI algorithms. The system I was using worked well on my lighter-skinned friend's face, but when it came to detecting my face, it didn't do so well, until I put on a white mask. There's a really groundbreaking piece of research that I think many people associate with this concern and movement around AI fairness and bias, and that's the 2018 research done by Joy Buolamwini and Timnit Gebru. They were trying to figure out whether popular facial-recognition programs could correctly pick out the gender of the people in different photographs. And what they found was that the facial-recognition software they looked at, which was from Microsoft, IBM, and Face++, misidentified darker-skinned women far more frequently.


And a few months later, Buolamwini, along with Deborah Raji, published a follow-up paper that looked at Amazon's cloud facial-recognition program, which is called Rekognition. Amazon was already pitching it for use by police and by immigration services, and we pretty much said, "Well, these products are already out there in the market. Do they actually work for faces that look like ours? Darker, female faces." What we found was that these models were performing at less than 70% accuracy for darker female faces, while performing at pretty much 100% accuracy for lighter male faces, revealing a discrepancy in the performance of these systems for different demographic groups, and in particular endangering the lives of those who would be impacted by the deployment of these systems.
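The methodology behind audits like these is worth spelling out: run the model against a labeled benchmark, then report accuracy broken down by demographic subgroup instead of as one aggregate number. Here is a minimal sketch of that kind of disaggregated evaluation in Python; the records and group labels are hypothetical stand-ins, not data from the actual studies.

```python
from collections import defaultdict

# Hypothetical audit records: each pairs the model's gender prediction
# with the ground-truth label and the subject's demographic group.
# (Illustrative stand-ins, not data from the actual studies.)
results = [
    {"group": "darker_female", "predicted": "male",   "actual": "female"},
    {"group": "darker_female", "predicted": "female", "actual": "female"},
    {"group": "lighter_male",  "predicted": "male",   "actual": "male"},
    {"group": "lighter_male",  "predicted": "male",   "actual": "male"},
    # ...a real audit would have hundreds of examples per subgroup...
]

# Tally correct predictions and totals for each subgroup separately.
correct = defaultdict(int)
total = defaultdict(int)
for r in results:
    total[r["group"]] += 1
    if r["predicted"] == r["actual"]:
        correct[r["group"]] += 1

# Reporting per-group accuracy exposes gaps that a single
# aggregate number would hide.
for group in sorted(total):
    print(f"{group}: {correct[group] / total[group]:.0%} accurate")
```

The design point is the breakdown itself: a model that scores well overall can still be failing one subgroup badly, and that gap only becomes visible when accuracy is reported per group.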


Eventually Microsoft fixed its facial-recognition software, and big tech companies, including Amazon and IBM, said they would stop selling facial-recognition products to the police. Some local governments have also passed laws regulating facial-recognition software or banning its use in policing, but there's no federal law regulating its use. At the same time, there are other companies that sell their facial-recognition software to police departments. And there have been at least three cases of Black men who are suing because they say they were wrongly flagged by this facial-recognition software; police came after them because the software told them they were a suspect they were looking for. This is what happened to Robert Williams, in one of the first recorded cases of a false facial-recognition match escalating to a false arrest. I picked that paper up and held it next to my face, and I said, "This is not me." I was like, "I hope you all don't think all Black people look alike." And then he says, "The computer says it's you." Facial recognition is not the only example of algorithmic bias.


Machine learning, a subset of AI, powers many technologies. Huge data sets are used to train machine-learning models to make predictions. So we want a computer, for example, to learn the difference between a dog and a cat. We can show the computer a lot of examples of what a dog looks like, show the computer a lot of examples of what a cat looks like. And as a result of that, the computer kind of figures out that, well, pointy ears means a cat. Whiskers might mean a cat. Floppy ears means a dog, and uses these features in order to differentiate between any new image that it might see of what could potentially be a dog or a cat. If all of your cats are white and all of your dogs are gray and black, what happens if you put in a photo of a black cat? All of the cats that this algorithm has learned are cats are white.


Everything that it's seen that is black is a dog. And so the algorithm may decide, oh, this is black, it must be a dog. In reality, it's a black cat. Sometimes a data set that lacks diversity is the problem. Other times, the design of the model is the issue. It can be a completely unaccountable process. These data sets are so large, on the order of millions and millions of examples, that the AI system is analyzing data whose full scope even its creators can't know. The algorithm is identifying patterns in order to create a relationship between inputs and outputs, but those patterns are not necessarily things that humans can understand or see very well. These algorithms should be able to learn over time; they should be able to incorporate new data from the people using them. So even if you create an algorithm, check it for bias, and it's fine, the new data, the way people use the algorithm, could skew it over time and make it biased.
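The black-cat failure described above can be reproduced in a few lines. The sketch below assumes scikit-learn is available and uses an invented three-number encoding for each animal; because fur color happens to separate the classes perfectly in the flawed training set, the model latches onto color rather than ears or whiskers.

```python
from sklearn.tree import DecisionTreeClassifier

# Invented encoding: [fur_darkness, pointy_ears, whiskers], where
# fur_darkness runs from 0.0 (white) to 1.0 (black) and the other
# two are 0/1 flags.
# The flaw: every cat in training is light, every dog is dark.
X_train = [
    [0.10, 1, 1],  # white cat: pointy ears, whiskers
    [0.20, 1, 1],  # white cat
    [0.15, 0, 1],  # white cat with floppy ears
    [0.90, 1, 1],  # black dog with pointy ears and whiskers
    [0.85, 0, 0],  # dark gray dog
    [0.80, 0, 0],  # dark brown dog
]
y_train = ["cat", "cat", "cat", "dog", "dog", "dog"]

# Fur color is the only feature that separates the classes perfectly
# here, so the tree splits on color and ignores ears and whiskers.
model = DecisionTreeClassifier().fit(X_train, y_train)

# A black cat: feline features, but fur the model only ever saw on dogs.
black_cat = [[0.90, 1, 1]]
print(model.predict(black_cat))  # prints ['dog']
```

Under these features, the black cat is indistinguishable from one of the training dogs; more training on the same skewed data would not fix that, which is why the diversity of the data set matters as much as the algorithm itself.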


As an example of what could go wrong, a ProPublica investigation looked at software that's used to predict future criminals: who is more likely or less likely to commit a crime. And that investigation found that the algorithm, which was pretty widely used, was biased against Black people. The issue here is that the algorithm gets used for all sorts of things. It gets used for deciding who gets what sort of bail, who gets what sort of sentence, who gets parole. And the algorithm was considering Black defendants a greater risk than White defendants. ProPublica tracked some of these cases, and in some paired cases it actually turned out that the White defendant did re-offend and the Black defendant didn't. The unwillingness on the part of some tech companies to fully disclose their algorithms makes it hard to hold them accountable.
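ProPublica's analysis centered on error rates rather than overall accuracy: among defendants who did not go on to re-offend, Black defendants were far more likely to have been labeled high risk. Here is a minimal sketch of that kind of per-group false-positive check, using made-up records rather than the actual COMPAS data.

```python
from collections import defaultdict

# Hypothetical defendant records: the risk tool's label and whether
# the person actually re-offended. (Not the real COMPAS data.)
records = [
    {"race": "black", "flagged_high_risk": True,  "reoffended": False},
    {"race": "black", "flagged_high_risk": True,  "reoffended": True},
    {"race": "white", "flagged_high_risk": False, "reoffended": True},
    {"race": "white", "flagged_high_risk": False, "reoffended": False},
    # ...the real analysis covered thousands of defendants...
]

# False positive rate per group: of the people who did NOT re-offend,
# how many did the tool wrongly flag as high risk?
false_pos = defaultdict(int)
did_not_reoffend = defaultdict(int)
for r in records:
    if not r["reoffended"]:
        did_not_reoffend[r["race"]] += 1
        if r["flagged_high_risk"]:
            false_pos[r["race"]] += 1

for race in sorted(did_not_reoffend):
    rate = false_pos[race] / did_not_reoffend[race]
    print(f"{race}: false positive rate {rate:.0%}")
```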


In April, the U.S. Federal Trade Commission warned that it could crack down on companies that use biased AI software. But overall, regulation has been slow. So we're only going to see these AI systems being more and more prevalent, invisibly ubiquitous in society. If we don't do a good job of really thinking about how we're going to hold them accountable now, it's going to be even much more difficult later on to try to reel things back.


Just as with the ongoing fight to create more equitable political and socioeconomic systems, overcoming racism in product design may require more than simply renaming, rebranding, or recoding. These changes, although helpful, only scratch the surface of the deeper problem. It just says a lot about corporate America when Black people ask for equality for hundreds of years, and then they're heard in a span of three months last year, and the solution was changing a box of pancakes.

