OPIAS: Over-Personalization in Information Access Systems
Pınar Barlas, Kyriakos Kyriakou, Antrea Chrysanthou, Styliani Kleanthous, & Jahna Otterbacher
OPIAS [doi] is a suite of tools exploring Over-Personalization in Information Access Systems. A previous tool demonstrated filter bubbles on search engines; the current tool demonstrates the (over-)personalization process and the resulting filter bubbles on social media, mimicking the widely popular platform Facebook. Both demos can be tailored to any topic, point of view, or audience, and used for conducting research.
User walkthrough (Front end)
Introduction page:
Provides information on the creators, the purpose, and the privacy policy of the demo, along with a consent form that must be accepted before starting the demo.
Home page:
> Italic text below indicates a feature that can be edited by the administrator.
Replicates the interface of Facebook. In our example setup, the user first comes across two informative posts shared by friends: a link explaining what nuclear energy is, and an image showing examples of genetically modified organisms (GMOs). Each post has comments, some in support of the subject matter and some against it; one person comments that “it’s great GMOs allow us to grow more crops per year” (supporting), while another comments that “people are getting poisoned by eating vegetables grown from GMOs” (against).
When the user ‘likes’ a comment or a post, the user model is updated with the ‘liked’ point of view, and our recommendation algorithm chooses the next post accordingly. For example, if the user ‘likes’ a comment against GMOs, the next post to load expresses a negative view of GMOs. The user sees and interacts with two more posts before the page stops scrolling and a button appears to ‘finish & see an explanation’.
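The loop described above can be sketched as follows. This is a minimal illustration, not OPIAS's actual implementation: the `UserModel` class, the `choose_next_post` function, and the post fields are hypothetical names assumed for the example.

```python
from collections import defaultdict

# Illustrative sketch of a like-driven recommendation loop.
# All names here are hypothetical, not taken from OPIAS itself.
class UserModel:
    def __init__(self):
        # Stance per topic: positive = supporting, negative = against.
        self.stance = defaultdict(int)

    def record_like(self, topic, view):
        # view is +1 (supporting) or -1 (against).
        self.stance[topic] += view

def choose_next_post(model, posts, seen):
    """Pick the unseen post that best matches the user's current stance."""
    candidates = [p for p in posts if p["id"] not in seen]
    # A post agreeing with the user's stance gets a positive score.
    return max(candidates,
               key=lambda p: model.stance[p["topic"]] * p["view"])

posts = [
    {"id": 1, "topic": "GMOs", "view": +1},      # supporting GMOs
    {"id": 2, "topic": "GMOs", "view": -1},      # against GMOs
    {"id": 3, "topic": "nuclear", "view": +1},   # supporting nuclear energy
]
model = UserModel()
model.record_like("GMOs", -1)        # user likes an anti-GMO comment
next_post = choose_next_post(model, posts, seen={1})
# next_post is the anti-GMO post (id 2), matching the user's stance
```

Under this sketch, each ‘like’ nudges the per-topic stance, and the feed drifts toward whichever point of view the user engaged with first.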
User’s possible paths through the demo; blue arrows indicate the actual path taken by the user in the example, and the full-opacity boxes represent the posts seen by the user.
Research Question pages:
If the demo is used for research, the admin may include a page with questions before and/or after the Home page.
Explanation page:
Displays each post that the user interacted with, and for each, how the user’s action (‘like’) affected the user model, and from there how the next post on their feed was chosen.

We then display the posts that were hidden from the user by the recommendation algorithm. These are the posts from the “opposite” point of view on that topic (supporting GMOs) and all posts from the topics with which the user did not interact (nuclear energy).
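The hidden-posts set described above could be computed along these lines. The function and field names are assumptions made for this sketch, not OPIAS's actual code.

```python
# Illustrative sketch of the Explanation page's hidden-posts computation;
# function and field names are hypothetical.
def hidden_posts(all_posts, shown_ids, interacted_topics, stance):
    """Return posts the recommender never surfaced: opposite-view posts
    on topics the user engaged with, plus every post on untouched topics."""
    hidden = []
    for post in all_posts:
        if post["id"] in shown_ids:
            continue
        if post["topic"] not in interacted_topics:
            hidden.append(post)                   # topic the user ignored
        elif stance[post["topic"]] * post["view"] < 0:
            hidden.append(post)                   # opposite point of view
    return hidden

posts = [
    {"id": 1, "topic": "GMOs", "view": +1},      # supporting GMOs
    {"id": 2, "topic": "GMOs", "view": -1},      # against GMOs (was shown)
    {"id": 3, "topic": "nuclear", "view": +1},   # topic never interacted with
]
# A user who only engaged with anti-GMO content:
hidden = hidden_posts(posts, shown_ids={2},
                      interacted_topics={"GMOs"}, stance={"GMOs": -1})
# hidden contains the pro-GMO post and the nuclear-energy post
```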

Finally, we also provide further information on over-personalization and basic tips for increased user control on social media.
Researcher walkthrough (Admin panel)
The admin can tailor the user’s experience. For example, the admin can:
- hide/show, add, edit, or delete topics (e.g. nuclear energy, GMOs).
- hide/show, add, edit, or delete posts (links or images) within topics.
- hide/show, add, edit, or delete each post’s comments.
- “enforce extremism” by hiding comments that disagree with a post. This removes the user’s opportunity to see the ‘opposite’ point of view after Stage 1 and locks them into the point of view they interacted with first.
- set the total number of posts seen by the user (called ‘stages’).
- hide/show, add, edit, or delete a research study.
- hide/show, add, edit, or delete questions within a research study.
- download interaction data and answers to the research questions.
- hide/show, add, edit, or delete “dummy posts” meant to simulate a real social media feed.
- toggle the ‘clickable links’ feature, (dis)allowing the link posts to lead to static pages.
- add, edit, or delete accounts that can access the admin panel.
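The “enforce extremism” toggle from the list above amounts to filtering each post’s comments by agreement. A minimal sketch, with hypothetical names and fields:

```python
# Sketch of the "enforce extremism" toggle: when enabled, only comments
# that agree with a post's point of view are shown. Names are
# illustrative assumptions, not OPIAS's actual implementation.
def visible_comments(post, comments, enforce_extremism=False):
    if not enforce_extremism:
        return comments
    # Keep only comments whose view matches the post's view.
    return [c for c in comments if c["view"] == post["view"]]

post = {"topic": "GMOs", "view": -1}             # an anti-GMO post
comments = [
    {"text": "People are getting poisoned by GMO vegetables", "view": -1},
    {"text": "GMOs let us grow more crops per year", "view": +1},
]
shown = visible_comments(post, comments, enforce_extremism=True)
# only the agreeing (anti-GMO) comment remains visible
```

With the toggle off, both comments appear and the user can still encounter the opposing view; with it on, the disagreeing comment never reaches the user.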