Software helps shape society – and that brings responsibilities
Inaugural lecture Fabiano Dalpiaz

Software is not just a technical product; it is something that helps shape the society we live in. That comes with a greater societal responsibility. This was the central argument made by Fabiano Dalpiaz during his inaugural lecture as Professor of Software Production.
Can you still find your way without a Maps application? Or translate a foreign text without an AI assistant? How frequently do you have online meetings with your colleagues? “Software influences our daily lives,” says Professor Fabiano Dalpiaz, who delivered his inaugural address on 20 May. “Maps applications affect traffic flows, and social media apps are a big factor in building our social relationships.”
Ideally, one would expect software to be developed to address the needs and desires of its users. Those users should define the criteria the software must meet – not only functionality, but also wishes about usability and security. “People can express their needs, for instance, by leaving a rating in an app store.” But this is only part of the story. “Most software is built by companies, and their decisions, driven by commercial interests, tend to follow a different logic.”
Confidential requirements
These needs and desires are formalised in what are known as requirements, which developers use as the basis for designing the software. “But those are almost always confidential,” Dalpiaz explains. “And if the requirements aren’t public, how can anyone know whether the software does what it’s supposed to? Or – especially in the case of AI – whether the potential harmful consequences have been seriously considered?”
He gives the example of digital addiction. “This is a major issue for young people. A developer might say: I’ve built a social media app that meets all the technical specifications – without giving any thought to the ethical implications. And if children end up addicted and stay up all night on their phones, that’s supposedly not their responsibility.”
According to Dalpiaz, it’s time for a culture shift in the world of software development. “We shouldn’t see software merely as a technical product, but as something that plays an active role in shaping our society. And that calls for greater societal responsibility.”
He advocates for more transparency around requirements, and for incorporating human values into those specifications. That way, users can better assess whether their interests are taken into account when choosing which software to use – and companies can be held accountable for any harm their software may cause.
Dalpiaz’s research group works closely with industry partners and sees signs of a shift already under way. “Although software and its requirements are generally regarded as confidential assets, we are witnessing how companies are increasingly asked to provide evidence that they comply with societal regulations (such as the GDPR or the AI Act) when building software. This shows how powerful a mechanism legislation is.”
Ethical engineers
At the same time, Dalpiaz sees a hopeful trend: the rise of ethical engineers – software developers who are increasingly aware of the ethical implications of their work. “Thanks to artificial intelligence, you no longer need a hardcore computer science background to develop software. That opens the door for people with backgrounds in psychology, sociology, or law to engage with these issues. And that diversity is essential if we want this movement to grow.”