Dries Buytaert is something of a legend in the computer science community. As the founder and lead developer of the open source content management system Drupal, he is responsible for the technology that underpins one in 40 of the world’s websites.


Now chief technology officer of open source software-as-a-service platform Acquia, Buytaert remains a key voice in the online technology space. However, he is deeply worried about how it has developed and continues to develop, particularly when it comes to the biggest names in the space, known collectively as Big Tech, and their use of algorithms.


“I think there are a bunch of problems, obviously, all the way from having too much power in terms of monopoly, in terms of concerns around privacy and concerns also around the algorithmic influence, for lack of a better term,” says Buytaert.


“And personally, that's what I'm actually the most concerned about. Because I think for some of the other things, there are solutions.


“We have laws that deal with monopolies, right? But if I think about algorithms, as you know, algorithms kind of rule the world.”

Dries Buytaert, CTO, Acquia

Algorithms in the machine: Dangerous, unregulated and out of control?

The problem with algorithms deployed by Big Tech, Buytaert argues, is that they have the potential to exert a powerful, and arguably dangerous, influence on people’s lives.


“I think more and more algorithms can be very influential and important in terms of how we live and how we conduct ourselves,” he says.


“And to me, that's a really hard problem, because the algorithms are in the hands of large organisations. And there is really no oversight today, like zero.”


He argues that there are resulting issues across many domains, from democracy to criminal justice and beyond.


“Google is known to be able to influence election results; just by tweaking the search results for certain keywords, they can effectively influence or manipulate people's thinking,” he says, referring to 2015 findings that its algorithm could unintentionally influence voter choice.


“People are sent to jail based on DNA tests, these DNA tests, they use software to calculate them and see if there's matches. I think about self-driving cars and how they do collision avoidance; software in these cars that decides, do I save the passenger? Or do I save the pedestrians?


“There's other articles about racist biases in search algorithms. It's just not okay. And obviously when they find out they try to fix it, but the fact is, that happens. And it may have happened for decades before they find out.”


The narrative of Big Tech doing more harm than good is increasingly prevalent, but Buytaert is keen to stress that the issues he sees with algorithms are not designed in, but are generally side effects of well-meaning projects.


“I don't think any of these companies were born with malintent. They all had a noble goal,” he says.


“I think they all believe in doing the right thing and doing good things. All of this is unintentional.”


But intentional or not, it is an issue. And notably, he argues that while other issues with Big Tech are beginning to be explored, for example with recent congressional hearings in the US, the lack of oversight of algorithms is not being addressed.


“I think they're focused a little bit too much like right now on, for example, the power that Amazon has on its resellers or suppliers. And they're talking about: maybe we should split up these companies because they're too big and too powerful, but it doesn't solve the algorithmic oversight problem,” he says.

Regulating the unregulatable: An FDA for algorithms?

The situation, Buytaert says, reminds him of the early days of medicine or mass food production.


“It was unregulated: everybody could cook up some medicine in their kitchen and then sell it. And obviously, that's a pretty bad idea,” he explains.


“And so in the US that led to the Food and Drug Act and the FDA, and then in Europe there's a similar agency responsible for approving drugs and setting standards for food quality and these kinds of things.


“It feels like we need something similar to me; some sort of body that can make sure these algorithms that have impact on society are okay.”


The notion of a regulatory body for algorithms is compelling in principle. But the reality of setting up such an organisation is, says Buytaert, “daunting”.


“I don't even know how you start doing that. I can't imagine anyone starting to audit Google's algorithm. You know what I mean? It seems so massive,” he says.


There have been some moves towards the regulation of algorithms, such as the EU’s Ethics Guidelines for Trustworthy AI. However, these are purely advisory, and companies are under no obligation to comply with them.


Added to the challenge is the fact that this isn’t simply a national concern, but an international one, involving technologies that bridge borders and cross continents. And as a result, a global organisation may be required to tackle the issue.


“I'm a member of the World Economic Forum, Young Global Leader it’s called, but they focus on what they call multi-stakeholder problems, so basically the kinds of problems that are hard to fix by one country,” he says.


“So I do think it needs some kind of global body, which just makes it even more complicated. It would stifle innovation; the consequences are pretty drastic, but at the same time, it does feel like at some point that would be good.”

Machine learning and open source: Using tech to combat tech

Given the scale of the challenge, the solution may not be armies of humans, but another set of algorithms designed to police the first.


“I guess in my mind, maybe some of it could be automated,” says Buytaert.


“I think we probably need to look at the world of algorithms and see which ones we want to audit, because not everything needs to be audited.


“Maybe some of it could be done through machine learning-based solutions over time, but it seems like a massive amount of work actually to go and audit these algorithms and then keep them audited too.”


Here he believes open source can play a role in ensuring algorithmic transparency.


“I'm born in open source, I’ve spent my whole professional career in open source, where everybody can look at the code,” he explains.


“So thousands of people have looked at Drupal’s code, for example, so everybody can see what Drupal does. It doesn't do anything shady, does it? Are there bugs in the algorithm? Are there biases in the software? Open source is, in a way, the ultimate form of transparency.”


Buytaert also sees a strong case for making most government code open source.


“I think software that's used by the government, it's built with public money, tax dollars, why is that not open source?” he says.


“At least we could all look at it and we could help make it better too, so I think open source is definitely the way to go.”


Similarly, he sees mandating algorithm audits as a way of incentivising businesses to make their code open source.


“If you want to deal with the audits, which maybe cost you hundreds of thousands of dollars or millions of dollars, it might become more compelling to [make code open source],” he says.


He adds that his own company, Acquia, which makes all its code open source, is proof that companies do not need to keep their code proprietary to be profitable.


“So there are open source business models that are very successful, and we don't care that people see our code. It doesn't prevent us from being a successful business.”

Championing change

Taking on Big Tech’s algorithms is no small challenge, and Buytaert acknowledges that “it’s a hard problem” that will take many steps to tackle.


“It starts with creating a policy and finding ways to enforce the policy, which is the hardest part. And you can use financial penalties that drive the self-enforcing of the policy,” he says, pointing to GDPR as a model. “But it does require a champion.”


Buytaert is not aware of any would-be champions in government at present, although he says he is “sure some of them exist”.


“We're not the only ones who are concerned about it, so I'm sure people are trying to figure out what to do about it.”


However, with such a challenge ahead, he isn’t optimistic about changes being made in the next few years.


“I think that if we were to have another call in three years, we would have the same problems,” he says.


“Honestly, I think there might be more examples of how things have gone wrong; there might be more awareness. But I don't see an overnight solution.


“That’s my honest gut feel. But I don't know. Maybe I'll be surprised.”
