Bumble (mini DEC)
1. The organization Bumble
Bumble is an online dating app launched in 2014. Users are offered a stack of profile cards that they can swipe left on to reject or swipe right on to indicate interest. Bumble is a product of Bumble Inc., founded by Whitney Wolfe Herd after she left the dating app Tinder. Her aim was to create a feminist dating app, and Bumble is accordingly presented as one, designed to challenge outdated heterosexual dating norms. The company says it empowers women to make the first move by giving them control of the conversation, and, beyond that, that Bumble is a platform that empowers all users to create safe and healthy connections.[1] The app is freemium, meaning the core functionality can be used for free, while paid features add options such as seeing who liked you first or backtracking on accidental left swipes.
2. The AI technologies employed
When you go to the Bumble website and click the link under "how does Bumble work", you only get tips on how to stand out on Bumble; no information is disclosed about how the matching algorithm actually works. According to an article on Medium,[2] Bumble's algorithm is based on the Elo rating system,[3] which was originally designed to rank chess players. You rise in the ranks as more people swipe right on you, but each swipe is weighted by who the swiper is: the higher the swiper's own rank (more desirable, more right swipes received), the more their swipe counts toward your score. As a result, users with similar scores see each other more often. We consider this a rule-based rather than a self-learning algorithm. The algorithm also makes use, in some form, of Collaborative Filtering.[4] Collaborative Filtering is a method used in recommender systems based on the principle that if user A shares user B's interest in one product, user A likely also shares user B's interest in other products.
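The two mechanisms described above can be sketched in a few lines of Python. Bumble discloses neither formula, so everything below is an illustrative assumption: the Elo part uses the standard chess update with a right swipe counted as a "win" for the swiped user, and the Collaborative Filtering part uses a simple cosine similarity over swipe histories.

```python
import math

def expected_score(rating_a, rating_b):
    """Standard Elo expected outcome of A 'against' B."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

def update_on_swipe(swiped_rating, swiper_rating, right_swipe, k=32):
    """Update the swiped user's score after one swipe.

    A right swipe counts as a win, a left swipe as a loss. Because the
    expected outcome against a high-rated swiper is low, a right swipe
    from a desirable user moves your score more -- matching the weighting
    the Medium article describes.
    """
    outcome = 1.0 if right_swipe else 0.0
    return swiped_rating + k * (outcome - expected_score(swiped_rating, swiper_rating))

def cosine_similarity(a, b):
    """Similarity of two users' right-swipe histories (1 = liked, 0 = not)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# A right swipe from a highly rated user lifts your score more than one
# from a low-rated user:
print(update_on_swipe(1200, 1600, True))  # larger gain
print(update_on_swipe(1200, 1000, True))  # smaller gain

# Collaborative Filtering: Bob's swipe history resembles Alice's, so profiles
# Alice liked that Bob has not yet seen become candidates for Bob.
alice = [1, 0, 1, 1, 0]
bob   = [1, 0, 1, 0, 0]
carol = [0, 1, 0, 0, 1]
print(cosine_similarity(alice, bob) > cosine_similarity(alice, carol))  # True
```

In this sketch, the feedback loop the next section worries about is easy to see: users who start with few right swipes gain rating slowly and are shown less often, which in turn yields fewer right swipes.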
3. Ethical concerns
Bumble claims to be proactive in creating a safe and empowering environment, especially for women. I believe these values are good starting points from which to create a dating app. However, there are several ethical concerns associated with its AI technologies:
Bias in the Algorithm
The use of the Elo rating system and Collaborative Filtering can introduce biases. Since the algorithm prioritizes users who receive more right swipes, it can marginalize users who are swiped right on less frequently, including those from marginalized groups. This can amount to a form of discrimination. A relevant case involves Breeze Social B.V., another dating app. According to the Dutch College for Human Rights, Breeze's self-learning algorithm resulted in indirect discrimination based on race (Judgment Number 2023-82, 01-08-2023). Breeze's algorithm, which calculated match chances based on user preferences and like behavior, unintentionally lowered the match chances of users with darker skin or non-Dutch backgrounds. This led to fewer matches and reduced app usage among these users, despite Breeze's intention to promote diversity. The College ruled that although Breeze did not directly discriminate, its algorithm reinforced user biases, resulting in indirect discrimination. Breeze was advised to take measures to prevent such discrimination; the College emphasized that adjusting the algorithm to ensure fair treatment of all users was not only permissible but required. Although Bumble's algorithm is rule-based rather than self-learning, similar biases could still arise. Ensuring that Bumble's algorithm does not reinforce biases against certain demographics is crucial for maintaining fairness and inclusivity. Regular audits should be conducted to detect such discrimination. Additionally, following the Breeze example, Bumble should actively take measures to prevent indirect discrimination and promote diversity among its users.
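One concrete shape such an audit could take is a disparity check on match rates across demographic groups. The following is a hypothetical sketch: the group labels and numbers are invented, and the 0.8 threshold is borrowed from the "four-fifths rule" used in US employment-discrimination testing, not from anything Bumble or the Dutch College prescribes.

```python
def match_rate(matches, impressions):
    """Share of profile impressions that resulted in a match."""
    return matches / impressions if impressions else 0.0

def disparate_impact_ratio(rates):
    """Lowest group match rate divided by the highest.

    Under the four-fifths rule, a ratio below 0.8 is a red flag for
    possible indirect discrimination and should trigger a deeper review.
    """
    highest = max(rates.values())
    return min(rates.values()) / highest if highest else 1.0

# Hypothetical audit data: (matches, profile impressions) per group
audit = {
    "group_a": (480, 1000),
    "group_b": (300, 1000),
}
rates = {g: match_rate(m, n) for g, (m, n) in audit.items()}
print(disparate_impact_ratio(rates))  # 0.625 -> below 0.8, flag for review
```

A real audit would of course need careful group definitions, consent for collecting demographic data, and statistical significance testing, but even a simple ratio like this makes disparities visible and trackable over time.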
Privacy and Safety
Bumble has implemented various safeguards to protect user privacy and safety. For account verification, users are required to provide their phone numbers and may be asked to undergo photo verification, which helps prevent fake accounts and malicious activity. Users can also verify their profile photos voluntarily to receive a blue 'verified' badge. When a user opts for photo verification, Bumble scans the submitted photo using facial recognition technology and compares the biometric data with the uploaded profile pictures. These scans are retained for future verification and record-keeping purposes for up to three years after the user's last interaction with the app (see Bumble's privacy statement[5]). However, the term "record-keeping purposes" is vague and raises concerns about data storage: it makes me wonder whether this facial data is used beyond immediate verification needs. Clearer communication about the specific purposes and duration of biometric data retention is essential to build trust and comply with privacy standards. Bumble should explicitly state why this data is retained and how it is protected, and should ensure that it is deleted securely once it is no longer needed.
Addictive Behavior and Incentive Mismatching
Because Bumble's business model is aimed at maximizing user engagement, there is a concern that it can encourage addictive behavior. The app's algorithm might be designed to keep users swiping longer: for instance, after showing you a good match, it could follow up with another good match to keep you on the app a bit longer, conditioning you to think that something better might come along. Such a practice would be contrary to Bumble's stated goal of fostering genuine connections, as it prioritizes prolonged engagement over successful matchmaking. Transparent communication about the app's intentions and mechanisms is necessary to align user expectations with the company's practices.
4. Recommendations
To address the concerns highlighted above, Bumble should investigate potential biases introduced by the Elo rating system and Collaborative Filtering, and implement measures to ensure marginalized groups are not unfairly disadvantaged. Regular audits and bias testing should be conducted to identify and mitigate any discrimination. Bumble makes a genuine effort to keep the platform safe from bots and fake profiles; however, the company should be more transparent about its use and record-keeping of biometric data. It should provide detailed explanations of the purposes of this record-keeping and the security measures in place to protect user information, and it should consider minimizing the retention period for biometric data to the shortest necessary duration. Additionally, Bumble should ensure that the app's algorithm does not encourage addictive behavior by deliberately delaying or manipulating matches to keep users engaged, as this conflicts with Bumble's stated aim of fostering genuine relationships. Lastly, Bumble should provide users with tools to manage their usage and avoid developing addictive habits. By addressing these ethical concerns, Bumble can strengthen its reputation as a feminist dating app committed to safety, fairness, and genuine connections.