Professionalizing Humanitarian Interpreters?

When I began training interpreters for the ICRC in 2010, I believed that the professionalization of humanitarian interpreting was merely a matter of training and resources. Twelve years later, my thinking on the issue has evolved quite a bit and I am no longer sure that “professionalization and training” is the right approach to humanitarian interpreting.

Why that is the case is explained in the video lecture below.

#MemorableMultilinguals: Africans*

I cannot count or recount the number of times that a European who is more or less closely involved with languages (translators, interpreters, sociolinguists, school teachers, …) and who has had an opportunity to visit “Africa” or interact with “Africans” (more on the scare quotes later), has told me in amazement that “Africans are naturally multilingual”.

I am deeply skeptical about any utterances that contain the word “naturally”, or “Africa/n” or “multilingual”, so imagine what a bummer it is for me to be confronted with these three words in one sentence, along with zero other redeeming content.

I suggest that we take it step by step and analyse this statement for what it is: a cliché which, like all clichés, also contains a kernel of truth. But that kernel is not necessarily where you think it is.

While the term “African” is sometimes used in a relevant way, it is most often a catch-all for a whole continent that is more diverse than this simplification suggests. So the first obvious problem with the above-mentioned statement is that it is unclear who these “Africans” are. Based on experience and precedent, I think it is quite safe to say that the people who start their sentence this way are not reminiscing about their last long weekend in Casablanca or their visit to the Pyramids of Giza. They are talking about “Sub-Saharan Africa”, i.e. “where black people come from”. This use of the term is of course widespread, including in African Studies, where people generally focus on only that part of the continent (because hey, we are not doing Islamic or Middle Eastern Studies, which is where North Africa fits in…). International organizations speak of “Africa” and the “MENA” (Middle East and North Africa) region as two different entities as well, so including only sub-Saharan Africa is not a problem per se. However, conflating “Africans” with “black people” is much more problematic: not all Africans are black and not all black people are African. The myth of multilingualism is, however, often applied to black people and their descendants, and often used as a gate-keeping mechanism.

We all like to think of ourselves as “naturals” in one field or another. That is because we like to flatter ourselves and also (mainly!) because we lie to ourselves a lot. Most things that come “naturally” to us are the products of our socialization in a specific context, the result of a kind of learning that happens simply by virtue of existing in a given environment and often goes unnoticed by the learner herself. We internalize ideas about the world and our place in it and come to think of these as immovable features of the universe.

One of these ideas that each and every European in my generation (yes, myself included, absolutely!) has been exposed to simply by growing up in Europe and has internalized, whether or not they are able to be honest about it to themselves, is the inherent superiority of Europe, European culture and European civilization over all things “African”. And when a speaker who comes from that socialization tells me that Africans are “naturally” this, that or the other, then that word has a specific connotation that is problematic. Because on the one hand, “natural” means through no effort or higher processes of learning, through no structured quest or ambition, through nothing else than undeserved endowment from God or whatever else one worships. And on the other hand, “natural” also means that this is the way things are and that there is little one can do to change them, even if one wished to do so.

All in all, this is a lose-lose situation for the “naturally multilingual African” – not only is her multilingualism not recognized as the intellectual accomplishment that it is, it is also something that is taken as a default feature of Africanness to the point that the absence of this feature is akin to a birth defect. Europeans, on the other hand, are expected to be monolingual by default (a lie, as we will see below) and any sign of multilingualism is thus “naturally” (see what I did there?) worthy of praise and recognition.

But what exactly does “multilingualism” mean in this context? What Europeans mean to say when they speak of “Africans” as “naturally multilingual” is that they understand that the language they used in order to communicate with the Africans they met is unlikely to be these individuals’ mother tongue. Thus, these people must speak another language. And because it is Africa we are talking about, that other language must be very, very, very different and very, very, very exotic, and very, very, very hard to learn. It can therefore only be spoken by those for whom it is “natural”.

This thinking frees the European from any pressure to engage with the local language and spares her from making even the slightest effort to learn it – and we know that there is hardly a European who comes back from a longer stay in Latin America without proudly showing off their Spanish, however rudimentary it might be. Another thing that is implicit here is that there is a hierarchy between languages. I do not think that there is an inherent qualitative difference between languages or that there are languages that are inherently more or less suitable to encapsulate the modern human experience. However, it is a fact that the opportunities that come with a language differ hugely from one language to another. English opens doors that simply cannot be opened with Gikuyu, Zulu or even Finnish, no matter how much one would like the opposite to be true. That is the reality of things.

The myth of African multilingualism, however, obscures the fact that there are still millions of Africans who are, in fact, not multilingual in the common sense of the term: they speak only their mother tongue and barely a few words of the official language in their country. The politics, the education system and in many cases even courts and hospitals of their country remain out of reach for these individuals. The fact that the Africans Europeans interact with are often multilingual (because they have to speak the European language, duh) does not make this a universal “truth” about Africa.

Unless it does. I mentioned above that the term “multilingual” makes me queasy and that is because it implies that there is such a thing as a “monolingual” individual. I have never met one. Yes, there are people who master the elements of only one of the systems that we call “language” but even those individuals will speak very differently in different contexts, and leverage communicative resources that bear surprisingly little resemblance to each other. Is that not a form of multilingualism? Indeed, the statement about the multilingualism of Africans reveals the very problematic way in which many Europeans still look at language: as something with patterns and rules that must be learned, as different systems that co-exist with each other in a hierarchy and that are best kept apart and pure. And yet, the fact that we notice the most recent “anglicisms” when they crop up in German or French but consider yesterday’s Gallicisms in English as a normal part of the English language shows that purity is simply a matter of time. The time when languages used to be pure is roughly around the same time when America used to be great – and that time is not anywhere BC or AD but measured on a different scale: BS. So we can say that all Africans are multilingual, but only if we recognize that all human beings are actually multilingual and stop exoticizing and othering anything “African”.

And yet, the true reason Europeans cannot but notice the multilingualism of many sub-Saharan African cities and towns is that people constantly switch and even mix (gasp!) languages and that this mixing is not generally frowned upon. So people are multilingual in one and the same sentence – and once again, like all things “African” – that surely cannot be the right way to be multilingual. Then again, it probably is the natural way (this much is true) – but culture is specifically there to preserve us from nature.

It probably does not help that a surprising number of Europeans who travel to Africa are primary and secondary school teachers using their vacation time, which is much longer than for any other profession and thus allows for more extensive traveling, to do some volunteer teaching down South. After weeks of fighting an uphill battle against groups of rowdy school children who are unwilling to do anything other than repeat full sentences uttered by the teacher in English or French, and who invariably switch back to another language during breaks, the only thoroughly positive and uplifting thing these teachers find to say when they come back is: “Africans are naturally multilingual.”

Bless their hearts, they mean well, I know they do.

*The attentive reader may now complain and say that the title of this post is deeply misleading. I have not told you much about the multilingual Africans I was advertising, just about the monolingual Europeans that describe them. Point taken. But would you have read a post about European multilinguals? Were you not “naturally” curious to learn more about the exotic African multilingualism?

#MemorableMultilinguals: the bilingual on the podium

The name does not matter because, if you are an interpreter or regularly participate in multilingual meetings, you have probably met one version or another of this person in the course of your career. When I think about them, I think about a guy because the majority of people on podiums still tend to be men and maybe also because men are often more eager to venture outside their area of competence. So for the sake of readability, let’s call this multilingual individual Peter but don’t get attached to the name because what matters is ultimately not him but the context that allows for someone like Peter to emerge.

My last encounter with Peter occurred during a bilingual meeting, where I was tasked with interpreting between German and French. As tends to be the case in Switzerland, the overwhelming majority of attendees were German speakers, and French speakers a tiny minority. The language distribution on the podium was even more skewed, since the first language of basically everyone up there was German. The interactivity of the meeting was low, i.e. most participants were not planning or expecting to intervene and had made the journey merely to receive information from their board and vote on different issues by show of hands. From an interpreting perspective, the linguistic setup was thus extremely imbalanced: more than 90% of utterances would have to be translated from German into French, and it was unclear what the distribution for the remaining 10% would be. Peter and his colleagues were sitting on the podium, ready to present an annual report about their different areas of expertise. Our French-speaking clients were sitting in their seats, clutching their headphones in the understanding that they would have to follow the entire meeting through their interpreters. This is where things get interesting.

The minoritized French speakers were very much aware that this was a multilingual meeting with interpretation. That awareness comes with being a minority and losing your communicative independence. The majority German speakers were, however, getting ready to attend a monolingual meeting. Barely any of them carried headphones to their seats; they took part in the meeting with the certainty of those who know that they will understand everything because that is just how the world works. That certainty, however, was shattered when a French speaker unexpectedly decided to take the floor and ask a question. This question was, of course, interpreted simultaneously into German, since that is the job we were recruited to do that day. However, we quickly realized that we were interpreting into the void: none of the German speakers actually wore headphones. They just exchanged blank stares in horror, realizing all of a sudden that this was actually not a monolingual meeting at all.

Fortunately, Peter came to the rescue, taking the floor from the podium to hastily improvise a summary of the French speaker’s question in German. From an interpreting standpoint, the summary was neither complete nor particularly accurate. The main point the speaker had been trying to make fell flat. But balance had been restored, the German speakers had once again regained control over the situation. Not a single German-speaking delegate got up to pick up headphones at the entrance of the room after this incident. They simply had not understood that the interpreters had also translated that part of the meeting, since the whole point of the interpreting provision was to cater for the (special?) needs of the minority.

To take on this task, Peter had to have an understanding of both the minority and the majority language, although he did not necessarily have to be fluent in both. Peters exist everywhere. Peters are a product of power asymmetries between groups of speakers. They exist because implicitly or explicitly, many speakers of the dominant language, whether English in international conferences or German in Switzerland, see interpretation as necessary to get their own message across, but not to hear the messages of the minority. They are surprised when put in a situation where they do not understand another speaker, used to being understood and heard wherever they go.

Peter’s presence points us to the limits of interpretation, and reminds me of what Bourdieu wrote nearly 30 years ago about “legitimate” linguistic competence: being able to make oneself understood is not the same as being able to make oneself heard. A message presented in the “wrong” language might be understood, yet not treated with the same care and not met with the same respect as a message presented in the “right” language. Bourdieu’s argument relates to speakers of the same language whose speech patterns (vocabulary, accent, prosody) do not have the same level of legitimacy; however, his thinking can be applied to multilingual settings as well. By jumping in to provide a consecutive summary, the resident bilingual ensures that a message can potentially be understood (or at least noticed) but this approach also signals to the speaker that their intervention is disruptive and, amidst the commotion thus created, very unlikely to be heard.

While solving a communication problem in the short run, these bilinguals ultimately allow for a much bigger communication gap to continue unchallenged and for the majority language speakers to participate in what for them is essentially a monolingual meeting.

Professional interpreters might convince themselves that they see their role as making sure that a message uttered in one language is “understood” in the other language because that is what the principle of impartiality seems to dictate. However, I suspect that just like me, many colleagues have felt frustration or even mild anger when a delegate speaking “their” language makes a highly relevant point that is completely ignored by the other people in the room. So I guess that what we really want is for these messages to be heard, and when this is not the case, we feel poorly about our own performance and the relevance of our contribution.

It’s not Peter’s fault, really. He means well.

But we can probably do a better job of making clients aware of the consequences of his approach, so that next time, he can use his platform to gently remind everyone in the room to just wear their bloody headphones and select the correct channel in advance. So that for once, the burden is on the speakers of the dominant language.


Bourdieu, Pierre. 1991. Language and Symbolic Power. Cambridge, UK: Polity Press.

Why “Publish or Perish” is bad advice

Publish or perish sounds snappy and rings true, which is why we really need to ask ourselves whether it actually is. It is a phrase used by journalists and commentators to describe the current state of academia, and also passed on as advice from senior academics to their younger colleagues, and from junior academics to their peers. The argument that I will develop in this post is that it does neither well. Publish or perish is not in any way an accurate description of academia, nor is it sound advice for academics.

In fact, publish or perish is a meme that keeps many researchers stuck in what is inherently an abusive relationship with a system that gives them an illusion of agency that is just good enough to make them hang on.


Let me get the obvious one out of the way first: publish or perish has nothing to say about publication quality and instead seems to emphasize quantity. After all, a high-quality publication takes time, sometimes years, and you are supposed to be publishing all the time. In many institutions there are formal or less formal publication targets and full-time academic staff are expected to produce around 2 to 3 articles a year. That may not sound like much, but it generally boils down to writing one high-level and several lower-level papers, or artificially splitting data sets from a single project into several subsets that can be published in separate papers. In many fields this has also led to a proliferation of second- and third-tier journals and an abundance of frankly rather mediocre articles. It also rewards academics for publishing basically anything, and a publication strategy that is based on writing few but very good publications almost looks like an act of resistance.

The way in which academic CVs are usually evaluated frankly does not help. Any prospective employer or funding body will argue that they will above all look at the quality of publications, not their quantity. But let’s be honest, they will not actually read your papers to see whether they are of good quality, they will use the impact factor of the journals you published in as a proxy for quality and that is deeply problematic. Not because the first-tier journals don’t publish quality – most of the time they do – but because, given the abundance of papers they receive (some journals reject over 95% of submissions), some excellent papers necessarily end up in the rejection pile, simply because they don’t fit with the stated aims of the journal or the preferences and interests of its editors. In addition, there are ways to get into high-level journals that might otherwise reject your paper, for example by applying for a Special Issue that is guest edited and comes with a pre-selected set of papers on a given theme. These papers will be published in the same journal, many authors might not mention “Special Issue” on their CV, and unless someone really takes the time to dig deeper, the impact factor of the journal is now on that author’s CV.

In addition, there are countless other variables that publish or perish fails to account for: different disciplines have different sizes, impact factors vary widely from field to field, editors and reviewers are only human and their decisions not always entirely fair or objective, and let me not get started on the politics of co-authorship and the order of authors on a paper and what that paper will then be “worth” on each of their CVs.

This is not to say that publishing is not good advice. I am infinitely grateful to those who encouraged me to just start publishing, even when I did not feel I had a legitimate voice within the discipline. Waiting until you feel that you have something important to say is not good advice – no discipline will accept a fundamental theoretical insight from someone who is completely unknown among her peers because she has never published a line of text before.

The problem with “publish or perish” is that it simplifies things to a fake binary distinction and glosses over the complexities that inhabit each of these three words. Speaking of binary systems…


Computers rely on different types of basic logic gates to establish relationships between two inputs, A and B. We can view these inputs as different conditions that can each either be met (1 – true ) or not met (0 – false), and that can furthermore interact in several different ways. These interactions are generally illustrated by a matrix as follows:

A      B
true   true
true   false
false  true
false  false

Each gate “opens”, i.e. turns a 0 into a 1, when there is a specific relationship between A and B.

  1. AND gates: A and B have to be simultaneously true
  2. NAND gates: one or both of the inputs have to be false for the gate to open – the NAND gate is the negation of the AND gate
  3. OR gates: one or both of the inputs have to be true, i.e. condition A or B is met, or conditions A and B are both met
  4. XOR gates: this gate is a true “exclusive or”, i.e. it opens only when exactly one of A and B is true – not when both are true or when both are false
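The four gates above can be sketched in a few lines of Python (my own illustration, not drawn from any library – the function names are arbitrary):

```python
def and_gate(a, b):
    # opens only when both inputs are true
    return a and b

def nand_gate(a, b):
    # the negation of AND: opens unless both inputs are true
    return not (a and b)

def or_gate(a, b):
    # opens when at least one input is true
    return a or b

def xor_gate(a, b):
    # opens only when exactly one input is true
    return a != b

# walk through the four rows of the truth-table matrix above
for a in (True, False):
    for b in (True, False):
        print(a, b, "->", and_gate(a, b), nand_gate(a, b),
              or_gate(a, b), xor_gate(a, b))
```

Running the loop reproduces the matrix row by row: AND opens on the first row only, NAND on everything but the first, OR on everything but the last, and XOR on the two middle rows.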

So what about publish or perish? The linking word parades as an “OR” but is actually an “XOR” gate, creating a binary opposition between two conditions that cannot simultaneously be true: you publish (A) or you perish (B). Several implications can be derived from this initial statement, and all of them are, to put it mildly, pretty much total bullshit:

If you don’t publish, you will perish.
If you publish, you will not perish.
If you did not perish, it is because you published.
If you perish, it is because you did not publish.

These statements arguably all sound much less snappy than “publish or perish”, which is exactly why it became a meme that is passed from person to person and effectively circumvents our critical reasoning. It sounds right and that’s about it. But the “if statements” above show publish or perish for what it is: a shortcut that establishes a direct correlation where none exists.

This is not to say that publishing is not a necessary requirement to attain legitimacy in the academic field. It very much is. There are people who have achieved tenure despite a poor publication record but they are the exception and not the rule, often owing their early tenure to largely arbitrary lucky circumstances, like good timing, a good network within their institution or discipline, and the retirement of a professor in their field shortly after they obtain their PhD. These are not things one should ever count on or plan for so publishing is still better than not publishing. For each of these success stories of early tenure, I have heard at least two stories of such early tenure being expected and then prevented by the arrival of a better qualified candidate. So while entitlement is never good advice, it is healthy to keep in mind that luck and randomness also play a role in all of this, especially when professorships are awarded “for life” and retirements end up skipping several generations of academics altogether (who were academically too young when a post became vacant and are biologically too old when it becomes vacant again). This means that many people will simply not be eligible to apply for certain posts because of bad timing.

But even if publishing is necessary, it is not a sufficient requirement to get promoted, tenured or even just extended and this is one aspect we tend to regularly forget or conveniently deny. Many sharp minds have left academia despite a solid publication record, simply because the number of academics far outstrips the number of available posts, scholarships and stipends. That is the reality of things. The statistics are murky and hard to come by, but it is safe to say that only a minority of those currently trying to obtain their PhD will remain in academia upon graduating, and only a minority of those currently employed as post-doctoral researchers will get long-term contracts or tenure. In the US there are now as many Ph.D.s working in the private sector as in academia, and that number includes all generations of academics and non-academics currently in employment, which means that the proportion for younger generations is likely much higher.

Many of those have left academia by choice, in pursuit of higher salaries, better working conditions and more stability. Others have left academia with a heavy heart, simply because they have reached the conclusion that the field has no place for them. Some of them probably did not have a good publication record. But I would bet that, on average, they probably had about as many publications as their peers when they left academia. They perished even though they published.

There really is no logic gate linking publishing to perishing. You can publish and perish, not publish and not perish, publish and not perish, and not publish and perish. Not perishing in academia is as much about competence as it is about luck, networking and randomness. Ask those who spent the year 2019 putting together funding applications about Corona viruses or pandemics, thinking that they would once again get rejection after rejection because their research was not considered topical enough…
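To belabor the gate metaphor one last time (again my own sketch, with made-up variable names): if we treat publishing and perishing as two booleans and feed all four real-world combinations through an XOR, the gate throws half of them away – which is exactly the meme’s sleight of hand.

```python
# All four (publish, perish) combinations occur in reality:
observed = [
    (True, True),    # publish and perish
    (False, False),  # not publish and not perish
    (True, False),   # publish and not perish
    (False, True),   # not publish and perish
]

# The XOR reading of "publish or perish" only admits the cases
# where exactly one of the two conditions is true:
xor_allowed = [(p, q) for (p, q) in observed if p != q]

print(xor_allowed)  # only two of the four observed combinations survive
```

The meme, in other words, models a world with two possible outcomes where reality offers four.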

It is not flattering to think about our professional successes as owed in large part to randomness and we therefore don’t. We try to tell ourselves tales of competence and merit. But the truth is, for every person who holds a PhD, there exist thousands of other people with equally (or more) brilliant minds who never got a chance to engage in higher education. Our social positioning is the result of a complex web of factors and we only have a limited amount of control over a limited number of them.

It is easy to think of a system that puts you on top as a meritocracy. That does not make it true.


While some ‘doctors’ are working in the private sector, others have decided to continue to hang on to highly precarious academic ‘posts’ that are often nothing more than exploitative makeshift arrangements where you are paid to teach and ‘allowed’ to use the institution’s name as an affiliation for the research you publish in your own time and without pay. They are still in the running to get tenure. They have not “perished”.

The third word in our little meme is by far the most toxic. It encourages academics to remain in a system that can be exploitative and abusive by depicting the alternative as inherently worse. It comes from the same brand of reasoning that encourages women to stay in abusive relationships, and justifies gender-based violence as inevitable. Yes, really. Hyperbolic, much? Probably a bit. On a meta-level. Just to drive home the point that leaving academia and dying are two very different things. Leaving academia is an individual decision, or sometimes simply the result of circumstances beyond our control. It is not a form of “giving up”. You are not leaving academia because of an inability to fight hard enough to stay, you are leaving because you decided that you now want to fight a different battle altogether. And that is fine.

Many professional fields that apply rigorous entrance requirements – both academia and conference interpreting come to mind here – end up exerting a cult-like pull on their members. Leaving the field is viewed by its members almost as an act of treason. The parallels between conference interpreting and academia are quite staggering here. In both cases, people who leave the field are seen as intellectually lazy or not hard-working enough, in line with the myth of meritocracy that members tell themselves to allow the field to self-perpetuate with all its inequalities. And in both cases, the people who occupy entry-level positions in the field (recent graduates in interpreting, doctoral and post-doctoral researchers in academia) are the ones most actively questioning the rules of the game, making everyone else extremely uncomfortable in the process. Especially, I shall add, those who still want to believe in the ideal of a meritocracy and have been shielded from the limitations of their own agency-centric world view by a hefty dose of privilege.

The aura of meritocracy, together with the mismatch between hopeful candidates and available positions, does, after all, contribute to giving the field an air of exclusivity, desirability and importance, all of which further enhance the symbolic capital of those occupying positions of power within it. Everyone else is expendable. Your struggle is not a bug, it is a feature.

Bottom line

If like me you enjoy research and writing: please continue publishing. Develop a publication strategy that suits your personality and your situation. But publish what you find relevant. Persevere to get your message out there, to be part of a discussion that you really care about. Enjoy the ride for its own sake.

However, do not publish merely to get promoted or tenured, to “not perish”. Because when that is the primary aim guiding your publication game, the time invested will not be time enjoyed but time stolen from yourself, your family and your friends. That time is not coming back.

The correlation between publishing and not perishing is spurious (and the internet offers a very entertaining rabbit hole of spurious correlations to go down on a rainy day) and the return on investment might therefore be disappointing. The only reward you might get for a publication is the process itself and how it has contributed to your intellectual growth. It sounds cheesy and not at all snappy, but it is true and in itself an enormous privilege in today’s troubled times. Don’t mess it up by writing about stuff that you only marginally care about, just because you think it will get you somewhere professionally. Or do – I am not judging you, really. I am just trying to be mindful of what I spend my own time on.

Then again, don’t take advice from someone who has just spent a lot of time writing a blog post that has zero value on her academic CV.

Histoire d’un retour au pays natal

Walking through Paris this morning I came across an Islamic funeral service. Given the number of Muslims in France you might wonder what the big deal is. Nothing, really, except that it made me think about something that has been on my mind for some time: death and exile.

Migrants, in particular the generation that has experienced transnational relocation first-hand, are often asked if or when they are planning to return to their country of origin. This question tends to produce awkward silence, apologetic shrugs, a hesitant “no” or an avalanche of explanations as to why exile is inevitable. In most cases the question is unwelcome, sometimes met with outright hostility. The question of returning home is taboo for several reasons.

First, it is often asked from within a general mindset that rejects migration, strives to make it temporary and refuses migrants any pathway towards becoming full members of their new society. A general suspicion towards those eager to ask this question whenever they see someone looking ostensibly “exotic” is therefore fully warranted.

Yet even when the question is raised among friends, and without xenophobic undertones, things can get awkward real quick. For starters, there are of course those who left their country under violent and traumatic circumstances, and for whom the impossibility of return is an open wound best left untouched. However, for the overwhelming majority of migrants the question of return awakens much less dramatic memories. Rather, it breaks a taboo in that it forces them to think about something that people in general prefer to suppress, namely that we are not really individuals but merely nodes in a complex network of relationships, dependencies and reciprocal exchanges. In other words, once you leave your country “life happens” and life keeps happening. You might marry someone from a different location, your children might be born in your country of residence or identify strongly with it, and a return to the homeland has become as impossible as a return to the past.

I have heard many migrants talk about returning “home” once they retire, and yet, once their grandchildren are born in their country of residence, they remain, forever postponing the intended relocation. What matters is not so much the implementation of the idea but the idea itself, the knowledge that one ‘could’ (even though, for all practical purposes one cannot) go back one day. This idea is deeply ingrained in the identity of any migrant.

“Why does it matter?” you might wonder. “If they are not going back then why not just say so? Is this another one of those fluffy emotional things?”

Well… let’s say that this limbo is a defining feature of the identity of anyone who lives far from things and people they love. Many people think of their childhood and youth with nostalgia, wishing they could return to this time in their lives. Yet for migrants, time and space intersect – while the impossibility of a return to one’s childhood is immediately obvious to anyone (although, truly speaking, human beings will surprise you…), the return to the homeland is in theory possible. Our childhood is gone, yet our homeland continues to exist. This makes the thought experiment appealing and reassuring, and many migrants build a mental sanctuary around this idea. Asking the “question” is a way of stumbling into that sanctuary and ignoring that its entrance is riddled with signs saying “Keep out!”, “Beware of dog”, “Authorized personnel only” and “Do not enter!”.

So how can you know whether or not someone has fully made peace with the idea of not returning home? Some questions that are left in limbo in life can only be resolved in death. The return to the homeland is one of these. Indeed, there is one set of questions that makes migrants much more uncomfortable than the one alluded to above because it breaks the state of limbo, violates the mental sanctuary and touches something foundational. Variations of the question include “Are you planning to die here?” or “So, where do you want to be buried?”. Talk about a dampener on casual dinner conversation. And yet, with close family this question sometimes comes up, and the response can tear generations and couples apart.

For many migrants, living abroad is perfectly acceptable, yet dying abroad remains taboo. This red line seems to exist for migrants of all ages, whether religious, agnostic or atheist. For some it is just a matter of being buried ‘at home’, others even wish to die in their homeland. For bi-national couples, or the children of immigrants the question of death – arguably an uncomfortable question for anyone – carries the additional weight of separation. “Till death do us part” takes on a stark meaning when you realize that, in death, your closest family (father, mother, spouse, siblings and children) might be scattered over three continents.

This is where the Islamic funeral service in Paris comes in. While a lot of their work seems to consist of repatriating bodies to their respective countries of origin, they also organize funerals in France. This is significant because Islam is still so closely associated with migration, and many Muslims in France have expressed a feeling of alienation and not-belonging despite being French citizens. Indeed, getting a passport might make you a citizen, but you only ever stop being a migrant when you are buried in a country. And a country only ever really welcomes you once it offers you the possibility to be buried there according to your customs.