Nordic Personalisation Workshop

September 16-17, Gothenburg, Sweden


The Nordic Personalisation Workshop

We invite all researchers and practitioners in the Nordics to the second Nordic Personalisation Workshop at the Wallenberg Conference Center, Gothenburg. We aim for two days full of talks and discussions on ethics and responsibility in personalisation and recommender systems. There will also be plenty of time to meet peers from both industry and research and exchange ideas during coffee breaks, lunch, and after-workshop drinks.

What to expect


Technological advancements have transformed media consumption into an omnipresent and decentralized activity. Social media, in particular, has become a crucial channel for media’s constant accessibility. However, the sheer volume of information has made it increasingly difficult for users to manage their media intake. Recommender systems, a branch of Artificial Intelligence (AI), have become essential in helping users sift through this deluge. These platforms utilize AI to filter content, aiming to match it with the user’s preferences, often unbeknownst to them.

AI thus enables media providers to offer personalized experiences by discerning user preferences. Despite AI’s clear benefits — such as aiding users in discovering relevant content and managing the influx of information — its potential adverse effects cannot be ignored. AI systems have the propensity to reinforce existing biases, leading to skewed and inequitable recommendations. Such tendencies have brought algorithmic discrimination to the forefront of public attention. This term refers to the biased treatment of individuals or groups based on attributes like gender, age, or ethnicity.

Other negative consequences of media recommender systems include the creation of filter bubbles and echo chambers, which restrict users’ exposure to a variety of viewpoints. Excessive personalization by algorithms can trap users within a narrow media spectrum, potentially fostering polarized content ecosystems.

The absence of editorial oversight, especially on social media platforms, raises the risk of misinformation propagation.

While algorithms are not inherently designed to promote false information, they play a significant role in its distribution, especially as they tend to favor content that gains popularity, thus reflecting a ‘popularity bias’.

Developing responsible and equitable algorithms presents a considerable challenge. It is vital to foster a collective understanding of the various algorithmic methods and their societal implications. Our 2nd Nordic Personalisation Workshop seeks to tackle these issues. The two-day event will be held this year at the Wallenberg Conference Center, Gothenburg (last year's edition took place in Oslo).

Call for contributions


We invite all researchers and practitioners in the Nordics (and elsewhere) to submit their latest work on personalisation and recommender systems to the 2nd Nordic Personalisation Workshop in Gothenburg (September 16-17) as a talk or a poster. Talks and posters can focus on recent research results (including significant work-in-progress), applications and use cases of personalization and recommender systems, and directions of future research.

To present your work at the workshop, submit an abstract and select which type of presentation you prefer. The abstract must be in plain text and no longer than 500 words, not including bibliographic references.

Talks will be allotted 10 minutes of presentation time during the workshop, while posters will be given a one-minute slot in a poster madness session and presented in a poster session. Please submit your contribution abstract here.


Venue

The workshop will be held at the Wallenberg Conference Center in Gothenburg. Participation is free of charge. Lunch, snacks, and coffee are included on both days. A light dinner is included on the first day.