Popcorn Hack #1
An example of bias in media is the lack of playable female characters in early video games, such as the early Legend of Zelda games, where Zelda is portrayed as a damsel in distress to be rescued rather than a playable protagonist. This affects female gamers by limiting representation and reinforcing gender stereotypes. A likely cause of this bias is the historical male dominance of the gaming industry, where developers primarily catered to a male audience and designed games around traditional gender roles.
Popcorn Hack #2
I once used a website that required a specific type of CAPTCHA to log in, but the images were too small and unclear, making it difficult to complete. After multiple failed attempts, I felt annoyed and excluded, as if the system wasn’t considering users with visual impairments or even just bad lighting conditions. One way to improve this would be to offer an accessible alternative, like an audio CAPTCHA or a simple checkbox verification, to make the process easier for all users.
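To make that suggestion a bit more concrete, here is a minimal sketch of letting a user switch verification modes instead of being stuck with a hard-to-read image. The `verify_user` function, the mode names, and the expected answers are all made up for illustration; they are not from any real CAPTCHA service.

```python
# Hypothetical sketch: offer several verification modes so a user who can't
# read the image CAPTCHA can fall back to an audio challenge or a checkbox.
# The mode names and expected answers below are invented for this example.

EXPECTED = {
    "image": "x7k9q",              # text shown in the image challenge
    "audio": "seven three nine",   # words read aloud in the audio challenge
    "checkbox": "checked",         # simple confirmation box
}

def verify_user(mode: str, response: str) -> bool:
    """Return True if the response matches the challenge for the chosen mode."""
    expected = EXPECTED.get(mode)
    return expected is not None and response.strip().lower() == expected

# A user who fails the image mode can switch modes instead of retrying endlessly.
print(verify_user("image", "blurry guess"))   # False
print(verify_user("checkbox", "checked"))     # True
```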
Popcorn Hack #3
Bias could sneak into a fitness tracking app if its recommendations assume all users have the same physical abilities, age, or health conditions. For example, if the app sets step goals based on average adult activity levels, it might be unfair to older adults, people with disabilities, or those recovering from injuries. To ensure fairness and inclusivity, the app could include customizable goals based on individual mobility levels, allow users to input health conditions that adjust recommendations, and offer adaptive workout suggestions that accommodate different physical abilities. Adding voice guidance, larger text options, and alternative exercise modes would also improve accessibility for diverse users.
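As a rough illustration of the customizable-goals idea, here is a short Python sketch that scales a default step goal to the user's reported conditions. The baseline number, condition names, and adjustment factors are assumptions invented for this example, not values from any real fitness app.

```python
# Sketch: adapt a daily step goal to the user's own profile instead of a
# one-size-fits-all average. All numbers and labels here are illustrative.

BASELINE_STEPS = 8000  # assumed "average adult" default

CONDITION_FACTORS = {
    "recovering_from_injury": 0.4,
    "limited_mobility": 0.5,
    "older_adult": 0.7,
}

def personalized_step_goal(conditions: list[str], custom_goal: int | None = None) -> int:
    """Return a step goal adjusted for the user's reported conditions.

    A goal the user sets themselves always wins; otherwise the baseline is
    scaled by the most limiting reported condition.
    """
    if custom_goal is not None:
        return custom_goal
    factor = min((CONDITION_FACTORS.get(c, 1.0) for c in conditions), default=1.0)
    return int(BASELINE_STEPS * factor)

print(personalized_step_goal([]))                                  # 8000
print(personalized_step_goal(["recovering_from_injury"]))          # 3200
print(personalized_step_goal(["older_adult"], custom_goal=5000))   # 5000
```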
Homework Hack #1
Digital Tool: YouTube
Potential Bias: YouTube’s recommendation algorithm often promotes content similar to what a user has already watched, which can create filter bubbles. This means users might only see videos that reinforce their existing interests or viewpoints, limiting exposure to diverse perspectives. Additionally, some creators have reported that their content is less promoted due to factors like language, topic, or demographic bias in the algorithm.
Cause of Bias: The bias likely comes from YouTube’s data collection and algorithm design, which prioritize watch time and engagement. If certain types of content generate more views or ad revenue, the system may unintentionally favor them while overlooking diverse creators or niche content.
Solution: YouTube could improve inclusivity by allowing users to adjust their recommendation settings, such as choosing to see more diverse content outside their usual interests. Additionally, a transparency tool showing why a video was recommended could help users understand and control their viewing experience better.
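To make the two suggestions more concrete, here is a rough Python sketch of a user-controlled diversity setting plus a simple "why you're seeing this" label on each recommendation. The video list, topic labels, and function names are invented for illustration; this is not how YouTube's actual system works.

```python
# Sketch: a "diversity" slider reserves some recommendation slots for topics
# outside the user's watch history, and each pick carries a short reason.

import random

videos = [
    {"title": "Top 10 Gaming Moments", "topic": "gaming"},
    {"title": "Beginner Watercolor Tutorial", "topic": "art"},
    {"title": "Speedrun World Record", "topic": "gaming"},
    {"title": "How Vaccines Work", "topic": "science"},
    {"title": "Indie Game Review", "topic": "gaming"},
    {"title": "History of Jazz", "topic": "music"},
]

def recommend(watch_history_topics: set[str], diversity: float, k: int = 3):
    """Pick k videos, reserving roughly `diversity` of the slots for topics
    the user has not watched before, and attach a reason for each pick."""
    familiar = [v for v in videos if v["topic"] in watch_history_topics]
    unfamiliar = [v for v in videos if v["topic"] not in watch_history_topics]
    n_new = round(k * diversity)
    picks = random.sample(unfamiliar, min(n_new, len(unfamiliar)))
    picks += random.sample(familiar, min(k - len(picks), len(familiar)))
    return [
        {**v, "reason": ("outside your usual topics"
                         if v["topic"] not in watch_history_topics
                         else f"because you watch {v['topic']} videos")}
        for v in picks
    ]

for rec in recommend({"gaming"}, diversity=0.5):
    print(rec["title"], "-", rec["reason"])
```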