How might we...
...design an empowering tool that puts people in control of the content in their feed?
Today social apps rely on billions of weighted calculations to make ad suggestions to people, so there is no way to connect an ad to a single behavior or action. Despite this growing sophistication, it remains impossible to predict how a person will respond to an ad they see. More transparent tools that help people understand and manage their data and control which ads they see, especially ads featuring potentially sensitive content like alcohol, will increase engagement and improve their experience.
Friendlee is a social app that allows people to browse and share photos, videos, messages, news, and events with their friends. In return, people receive tailored content, ads, and friend and event suggestions based on their past behavior, interests, and likes.
In order to provide the service, Friendlee is powered by some of the following data:
Now more than ever, people want to know how suggested content is generated and to give feedback when they see something they don’t like. Media coverage of data usage has left many concerned about their own information. This is especially true of sensitive ads (like alcohol or parenting), and people want features that let them respond and give them control over the ads suggested to them. Simple, empowering tools will rebuild trust and make people more comfortable connecting with friends and family on the app.
How might we...
...design an empowering tool that puts people in control of the content in their feed?
Sometimes people’s feedback on ads doesn’t fit in a box. Sometimes an ad features sensitive content that isn’t right for them in ways an algorithm cannot predict. Friendlee's responsive button complements the “like” and “comment” options, offering people a space to explain why an ad isn’t working for them.
Say hello to Algo. Algo listens to a person’s ad feedback and helps them find a solution. To pinpoint an issue, Algo draws keywords from previous responses and matches them to metadata tags for potentially sensitive ads. From there, Algo offers a solution: users can opt out of seeing specific triggering ads going forward. For example, if a person is upset by a beer ad, Algo will remove all current and future alcohol ads from their feed. The conversation also gives the person an empowering experience and a space to be heard. Algo is a fast learner: each conversation trains the chatbot to improve people’s personalized ad experience moving forward.
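A minimal sketch of what that matching and opt-out step could look like, assuming a keyword library keyed by sensitive-ad category and ads labeled with metadata tags. The category names, keyword lists, and function names below are illustrative assumptions, not Friendlee’s actual implementation.

```python
# Hypothetical keyword library: free-text feedback terms mapped to the
# metadata tags used to label potentially sensitive ads (assumed names).
SENSITIVE_KEYWORDS = {
    "alcohol": {"beer", "wine", "liquor", "drinking"},
    "parenting": {"baby", "pregnancy", "newborn", "toddler"},
}


def match_sensitive_categories(feedback: str) -> set[str]:
    """Return the sensitive-ad categories whose keywords appear in the feedback."""
    words = set(feedback.lower().split())
    return {
        category
        for category, keywords in SENSITIVE_KEYWORDS.items()
        if words & keywords
    }


def filter_feed(ads: list[dict], opted_out: set[str]) -> list[dict]:
    """Drop any ad tagged with a category the person has opted out of."""
    return [ad for ad in ads if not (set(ad["tags"]) & opted_out)]


# Example: a person tells Algo a beer ad upset them and opts out of alcohol ads.
opted_out = match_sensitive_categories("I don't want to see this beer ad")
ads = [
    {"id": 1, "tags": ["alcohol"]},
    {"id": 2, "tags": ["travel"]},
]
print(filter_feed(ads, opted_out))  # -> only the travel ad remains
```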
This feature needs a smart chatbot that can engage empathetically with people, along with a large-scale keyword library mapped to metadata tags from potentially sensitive ads. Over time it will be expanded to cover feedback on all content, not just ads. As the chatbot learns from feedback, it improves the underlying algorithms and cuts down on the frequency of conversations. An improved ad experience will reduce the burden of reporting uncomfortable ads and boost app engagement.
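One way that learning loop could work, sketched under the assumption that each conversation confirms a keyword-to-category match and folds it back into the library so similar feedback is resolved with less back-and-forth later. All names here are hypothetical illustrations.

```python
from collections import defaultdict

# Assumed shared keyword library: sensitive-ad category -> confirmed keywords.
keyword_library: dict[str, set[str]] = defaultdict(set)


def record_confirmed_match(keyword: str, category: str) -> None:
    """Store a keyword the person confirmed as describing a sensitive category."""
    keyword_library[category].add(keyword.lower())


def known_category(feedback: str) -> str | None:
    """If the feedback already contains a known keyword, skip the clarifying chat."""
    words = set(feedback.lower().split())
    for category, keywords in keyword_library.items():
        if words & keywords:
            return category
    return None


# Example: after one conversation confirms "lager" refers to alcohol,
# later feedback mentioning "lager" is resolved without a new conversation.
record_confirmed_match("lager", "alcohol")
print(known_category("another lager ad, no thanks"))  # -> "alcohol"
```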
How might we build on Friendlee’s ideas to give people simple ways to manage their data and improve the ad experience?