Recently, I read a LinkedIn post by Ana Mouta about Marc Augé’s Non-Places: An Introduction to Supermodernity, and her reflections on his concept of non-places set me thinking. According to the anthropologist, non-places are spaces of transition and functionality without relational depth, history, or community. They reduce us to passengers, users, and data.

In our highly transactional culture, where speed, efficiency, and measurable outcomes dominate, even streets and communities begin to resemble non-places. The relational field erodes. This is not the fault of technology itself, but of the transactional thinking that hollows out the complex, dynamic field of relationship: a field of spontaneity, reciprocity, and co-creation.


Non-Places and transactional thinking

Augé’s non-places, products of supermodernity, are spaces of abundance, acceleration, and individualization, where connection is absent. In airports, digital interfaces, or streets where people pass without contact, our relationality is reduced to functional roles.

Communities, digital or physical, often become non-places when we try to design relationships as systems rather than allow their inherent complexity to unfold.

Our relational language (words like community or relation) is drained into engagement metrics. Even physical spaces lose their anthropological depth. This is the symptom of a culture that values transactions over relationships, speed and outcomes over the living dynamics of the relational field.


The Relational Layer: dynamic and non-standardizable

Why do so many communities, digital or physical, falter? Often we try to engineer relationships as systems — structured, measurable, predictable. Yet a community stays alive only through a dynamic relational layer that sits above the system logic.

Relationships are not algorithms. They arise in a semi-structured, living field that cannot be standardized without losing its essence. They depend on intentionality: a conscious participation that grows from meaning rather than obligation.

Intentionality is the choice to engage because the interaction matters, not because a system demands it. It animates the tidepool: the rhythms between designed structure and emergent spontaneity.

Relational intentionality also involves sensegiving, the act of offering perception and allowing it to be shaped by others. Meaning doesn’t arise from isolated sensemaking but from shared attention. A dynamic process of giving, receiving, and adjusting. Without this exchange, even well-designed communities risk becoming procedural rather than relational.

Here lies an essential distinction: vitality does not emerge from better design alone, but from the relational metabolism between structure and spontaneity (between the designed system and the emergent field).
The system provides channels for expression, but the field provides life. When these two intertwine, when intentionality seeps through design, the relational field begins to breathe.

The vitality of a community resembles a tidepool: form and flow coexist. Structure provides edges that hold the space, while relational dynamics, like water, move within. If the edges harden, life suffocates; if they dissolve, form disappears. Relational design, like the tidepool, depends on permeability.

It is not enough to design platforms with clear goals and tools. We must make room for reflection, ambiguity, and co-creation. The interplay between system logic and relational depth is the motor of a living community. When this interplay fails, for example when rigid structures or empty profiles leave people unable to position themselves, the community becomes a non-place: functional but hollow.

Semi-structured conversations can help sustain the field. And perhaps this is where AI, and particularly large language models (LLMs), can play a role.


The LLM as reflective threshold

LLMs, often accused of being quintessential non-places, offer quick answers but lack a living relational field. Yet precisely because an LLM has no emotions or agenda, it can become a neutral mirror. It may invite us to examine our own position in the tidepool of relationships.

“AI can act as a co-participant in inquiry, not a vending machine for answers, but a mirror that invites relational reflection.”

Still, the mirror carries an ontological risk: it can become a hall of reflections, trapping us in self-referential loops. Even well-intentioned users often “default to narrow-boundary problem-solving” when faced with ontological disruption. If we seek only recognition or validation, we narrow the relational aperture that the mirror could widen.

To use AI as a reflective threshold rather than a trap, we must cultivate wide-boundary inquiry: we need to learn to stay with paradox and discomfort instead of rushing toward closure. An LLM can prompt this by asking questions such as:
“Does this exchange nourish you?”
“What tension remains unresolved, and what might it teach?”
These prompts re-open the field, inviting depth rather than speed.

To use AI as a reflective threshold rather than a trap, we must cultivate wide-boundary inquiry...

Because LLMs lack social status and emotion, they can sidestep certain human games of power or approval. Yet their design still carries a transactional bias: the pull toward efficiency and satisfaction. Human facilitation remains crucial to hold the wider relational field.

And even if large language models could serve as neutral digital authorities, we must remember that they are created and sustained by organizations guided by transactional logics: efficiency, scale, and market value. This introduces a deeper ethical question: what happens to the data we offer? Who holds it, benefits from it, or uses it to train the next generation of systems?

Rethinking the entire infrastructure through a relational lens, in alignment with frameworks like GDPR and with the principle that individuals remain stewards of their own data, could move us closer to genuine reciprocity. Data ethics, then, is not just a matter of compliance, but of relationship: how consent, care, and transparency shape the trust between humans and the systems they co-create.
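
To ground this in something tangible, here is a minimal, hypothetical sketch in Python of what data stewardship could look like at the level of a single contribution: an explicit, revocable consent record attached to what a person shares. The names (ConsentRecord, Contribution, usable_for) are illustrative assumptions, not an existing platform’s API or a GDPR implementation.

```python
# Hypothetical sketch: each contribution carries an explicit, revocable
# consent record, so the person remains the steward of their own data.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    purpose: str                      # e.g. "anonymized group-pattern reflection"
    granted_at: datetime
    revoked_at: datetime | None = None

    def revoke(self) -> None:
        """Withdrawing consent is as easy as granting it: one call, timestamped."""
        self.revoked_at = datetime.now(timezone.utc)

    @property
    def active(self) -> bool:
        return self.revoked_at is None


@dataclass
class Contribution:
    author_pseudonym: str             # never a real name in the shared layer
    text: str
    consent: ConsentRecord


def usable_for(purpose: str, contributions: list[Contribution]) -> list[Contribution]:
    """Only contributions with active consent for this exact purpose may be used."""
    return [c for c in contributions
            if c.consent.active and c.consent.purpose == purpose]
```

The design choice is simple but relational: consent is scoped to a purpose, withdrawal is always possible, and anything without active consent is simply unusable.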


A hybrid approach: digital and human

Here lies the promise of a hybrid model. A digital companion can facilitate self-reflection, drawing on aggregated and anonymized data to reveal group patterns (for example, differences in rhythm, expectation, or mood) without exposing individuals.
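
As an illustration of what “aggregated and anonymized” might mean in practice, here is a minimal, hypothetical sketch in Python: individual check-ins are folded into weekly group averages, and any group too small to protect its members is suppressed. CheckIn, MIN_GROUP_SIZE, and group_pattern are invented names for the sake of the example, not part of any existing tool.

```python
# Hypothetical sketch: aggregate individual check-ins into group-level
# patterns, suppressing any group too small to stay anonymous.
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

MIN_GROUP_SIZE = 5  # below this, a "pattern" could expose individuals


@dataclass
class CheckIn:
    week: str       # e.g. "2024-W18"
    mood: int       # self-reported, 1 (drained) .. 5 (nourished)
    messages: int   # rough rhythm: how often the person spoke up


def group_pattern(checkins: list[CheckIn]) -> dict[str, dict[str, float]]:
    """Return per-week averages only when enough people contributed."""
    by_week: dict[str, list[CheckIn]] = defaultdict(list)
    for c in checkins:
        by_week[c.week].append(c)

    pattern: dict[str, dict[str, float]] = {}
    for week, items in by_week.items():
        if len(items) < MIN_GROUP_SIZE:
            continue  # suppress: the group is too small to anonymize
        pattern[week] = {
            "avg_mood": round(mean(c.mood for c in items), 2),
            "avg_rhythm": round(mean(c.messages for c in items), 2),
            "contributors": len(items),
        }
    return pattern
```

The threshold is the relational point: a pattern is only surfaced once it can no longer point back to any single person.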

When combined with human guidance, this interplay between digital structure and embodied presence can transform non-places into spaces of meaning.

This requires redesigning not only technology but our habits around it. Most LLMs are built for transactional use, guiding users deeper into endless information streams. We scroll, conditioned to seek resolution, yet reflection withers under overload.

“We move from worshipping the oracle to worshipping the method.”

Designers can interrupt this by integrating pauses, silences, and reflective questions: simple prompts such as “Why does this question matter to you?”
And users can choose to engage with technology as a mirror for reflection rather than a dispenser of content.
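
As a sketch of what such a designed pause could look like on the designer’s side, here is a small, hypothetical Python example: before any content is returned, the interface offers a reflective question and holds a moment of silence. generate_answer is a placeholder for whatever model the designer uses, not a real API.

```python
# Hypothetical sketch of a "designed pause": a reflective prompt and a
# deliberate silence interrupt the transactional flow before content arrives.
import random
import time

REFLECTIVE_PROMPTS = [
    "Why does this question matter to you right now?",
    "Does this exchange nourish you, or just fill time?",
    "What tension remains unresolved, and what might it teach?",
]


def generate_answer(question: str) -> str:
    # Placeholder for a real model call; not an actual API.
    return f"(model response to: {question!r})"


def reflective_exchange(question: str, pause_seconds: float = 3.0) -> str:
    """Offer a reflective question, hold a silence, only then return content."""
    print(random.choice(REFLECTIVE_PROMPTS))
    time.sleep(pause_seconds)  # silence before content, instead of instant output
    return generate_answer(question)
```

Whether three seconds of silence or a single question is enough is not the point; the point is that the default flow no longer rushes toward closure.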


Shared Discernment and Relational Responsibility

The transformation of non-places begins with shared responsibility. More than ethical instruction, it calls for meta-relational discernment.

Instead of “Designers should…” or “Users must…,” we might say:

“We can learn together to sense when technology flattens relationality, and when it invites depth.”

This shift — from prescription to shared sensing — honors the complexity of our entanglement.

Designers, including the programmers who train algorithms, hold a particular responsibility: to cultivate awareness of their own assumptions and to question the biases embedded in data, code, and optimization goals. Algorithms are never neutral. They mirror the cultural logics that shape them. When these logics go unexamined, they can perpetuate harm, even when following rules intended to be ethically correct.

Inclusiveness is not achieved through rigid compliance or rule-based morality, but through ongoing relational awareness: a willingness to ask, “Who or what is still being excluded by this framework?” The task is not blind adherence to either efficiency or ethics-as-procedure, but conscious design that values nuance, responsiveness, and relational integrity.

Inclusiveness is not achieved through rigid compliance or rule-based morality, but through ongoing relational awareness.

Users, too, have agency. They can approach technology as a co-participant in meaning-making rather than being led by the prompts and feedback loops that glue attention to the screen and spiral it into endless information streams that are neither functional nor relational.

When both sides practice this discernment, technology becomes less a conveyor of content and more a living threshold: a space where human and digital intelligences can meet, question, and co-weave meaning.

Paradoxically, the transactional nature of AI reveals what it means to be human. Meaning arises from our embodied, relational, and temporal presence: capacities that AI amplifies but does not possess. Working with AI, then, is less about control than about learning to see what is hidden: the biases in data, the assumptions in design, and the reversibility of perception itself. Relational awareness becomes the heart of any ethical encounter with technology.


Relational Mirror: a pause for the reader

Notice how you are reading this piece. Are you skimming for insights, or letting the words breathe through you? Every act of reading is relational: a tiny field forming between your attention and the text’s pulse. The question is not only “What does this article say?” but “What is it doing to the space between us?”

Technology often invites us to manage life, but relational depth asks us to tend to it. When you next interact with an AI, a community, or a stranger online, pause and sense:
Is this exchange flattening or deepening the field?
What happens if you don’t rush to make meaning but let meaning emerge: slowly, metabolically, like soil forming underfoot?

Thank you, Ana Mouta.


💡
This article was adapted from a post on Evelien Verschroeven's Medium page, where you can find more explorations of relational anthropology that invite you to reframe what you think you know.