Surveillance and Identity in the Age of Digital Face Recognition
Technology is not neutral
'Face Value' investigates the changing meaning of the human face in a digitised society where our portraits are continually captured and screened. Decisions about when, why and how you are identified are not neutral – they are shaped by structures of inclusion and exclusion. Biometric technologies identify individuals based on their physical and behavioural traits, but research shows that biometric systems are biased. They can amplify racism, sexism and other forms of discrimination.
What gets lost (or gained)
The exhibition presents five artistic positions that critically examine how technologies capture facial information; the social, political and cultural consequences of this phenomenon; and how we can reclaim our faces. The artworks invite you to reflect on what gets lost when human bodies, voices and emotions are reduced to binary code. The exhibition also explores how facial technologies can be used in alternative ways that allow for intimate connections between technologies, physical bodies and communities.
This exhibition is part of Wevers' PhD research, which has been funded by the Dutch Research Council (NWO).