Could we give users the power to curate?
Censorship is no longer only a discussion of the prohibition of content. With the massive democratization of publishing platforms, the influx of content has created a new opportunity for censorship: information overload and attention redirection. 500 hours of video are uploaded to YouTube every minute; The New York Times publishes 250 pieces of content every day; our president tweets over 4,000 times per year. It’s a lot to manage. The curation of content has become censorship’s most powerful tool.
Personalization of content creates filter bubbles that amplify existing biases and essentially force us to live in different realities. This personalized reality decreases the quality of the information we consume, lowers the likelihood that we will consider (or even hear) opposing viewpoints, and ultimately erodes civil discourse. Since the 2016 presidential election, the polarization and manipulation of content have been widely discussed around the globe.
Curation of content is not simply a taste issue or an entertainment issue. The curation of content is at the core of a productive democracy. For something so important, we must ask: could we give users the power to curate the content they consume?
Ultimately, the goal of effective curation would be to develop an unbiased understanding of the world that is free of fractured realities or perspectives. Differences of opinion are welcome – but those conversations should be able to exist on the same plane. If curation continues to polarize, there will be no equal ground to stand on.
Risks and Benefits of User Curation
To approach this question, I first wanted to clearly lay out the risks and benefits associated with user-controlled curation.
What are the risks associated with giving users the power to curate?
- Users prefer echo chambers. Filter bubbles offer the reassurance of your own opinions, reinforce existing biases, and keep you engaged with content you’ve been shown to enjoy. Handing over curation could worsen existing divides.
- Curation requires prior knowledge. To curate a truly broad and representative view of a topic, proper knowledge is helpful. How can you represent multiple viewpoints on a topic if you don’t understand it?
- Information overload could cause opt-out. If users aren’t fully empowered to curate content effectively, they could be overwhelmed and opt-out completely. Is biased information better than none at all?
- Do users even want this power? Without the proper tools to curate effectively, the cognitive load of constant decision-making could be too great. What if you don’t want to think?
- Will misinformation worsen? Are users informed and engaged enough to fight social control and propaganda?
What are the benefits associated with giving users the power to curate?
- Creates awareness of biases. By actively engaging in content curation to counteract bias, you will become more acutely aware of existing biases.
- Rebalances power dynamics. In cases ranging from content bans in China to nipple bans on Instagram, the ability to curate sits in the hands of the powerful. By giving control back to users, we can work to redistribute that power.
- Respects user autonomy. In addition to rebalancing power, giving control back to the user also respects their autonomy, intellect, and ability to choose.
- Teaches users to combat misinformation. This area is becoming increasingly urgent. As deepfakes and AI-assisted content creation become more common, it’s vital for citizens to keep fine-tuning their filters for real content versus misinformation. Relying completely on platforms to filter content trains users to be complacent over time. We must continue to ask ourselves: is this a credible source? Is this content logical? Can I fact-check this before sharing?
Applying My Ethical Framework
With these benefits and risks clearly displayed, I ran this problem through my framework. With individual autonomy and respect as core tenets of my ethical framework, I strongly believe that we should design products and services to give users more power to curate their content. The strongest argument, for me, lies in learned helplessness. If we never give users the power to curate, how will they ever learn to identify biased, false, or misleading information?
This artifact helped me understand where existing platforms lie, and where there are areas of opportunity. Escape Your Bubble, ConsiderIt, and Balancer were all tools we read about that counteract bias and create a more informed user. Despite the effectiveness of these tools, most of our day-to-day consumption still happens inside the echo chamber. Because these platforms are personalized, they give us a false sense of control and showcase content that feels resonant. This false sense of control keeps us from seeking more autonomy and leaves us complacent with the content we are given.
How can we actually give users control while still keeping them engaged?