Meet the Team: Nada

As a teenager in Lebanon, on the margins of this globalized world, I fell in love with the Internet because it offered me a window to the magic of the ‘outside world’. The beautiful kinship-based chaos of early social media and chat rooms is also how I discovered amazing global and local communities of people I would otherwise never have met.

These days, I don’t like social media so much. Mostly because we are being told that the only ways to connect are the rather shitty ones. That the price of belonging to a community is to let yourself be packaged as a product, and to accept that kinship online is now conditional.

I experienced the extent of the problem firsthand when I worked for big social media platforms, in roles meant to support people and keep them safe. Problems I raised were routinely dismissed as edge cases, a hindrance to the magic of code. I learned that tech is sacred, and that individuals must wrap themselves around the data cells and fit.

The thing is, technology is not magic. Yet we still treat code like a Big Book of Spells, instead of acknowledging that it’s just a chain of choices and causalities. We treat tech giants like divine institutions doing work that cannot be questioned, when really they’re just a big bureaucratic mess trying to keep up with their own code. They hide the extent of the human labor needed to keep online spaces vaguely functional, and they treat the negative consequences as nothing more than a public relations problem. It doesn’t have to be this way. Tech does not have to be a data-mining operation that hurts people in the process.

This is why I joined the team trying to make Darcy happen. Not just because it’s decentralized and open-source, but mostly because we’re prioritizing well-researched interaction models over the ‘code first, think later’ approach.

We’re trying to build a space where marginalized individuals are not edge cases to the magic of code. And what matters most to me is that we’re not hiding the reality of the important human work it takes to support an online platform. At the forefront of this work is Content Moderation: something that cannot be treated as just another tech-support scaling problem, nor left to the unsupported contributions of isolated volunteers.

We care about moderation as the backbone of a community, one that addresses complex legal, social, and political challenges. We care about making moderation decisions in an open discussion space, with relevant professionals involved, including mental health professionals, who are absolutely necessary to help deal with the horribly problematic content that so often characterizes platform moderation.

We don’t have all the answers (yet), but we’re working on it, doing research and investigating potential solutions. It’s hard, it takes time, and it takes money. But if you think it’s worth it, consider funding us through our Patreon.