Habiba Salaheldin

Robotic shades of (baby) blues: Can Social Assistive Robots combat Postpartum Depression in Egypt?

Anxiety, melancholy, and a loss of joy in life. Many new mothers in Egypt suffer from symptoms of postpartum depression (PPD), a disorder that affects a staggering 63.3% of mothers worldwide.

PPD, often referred to as the ‘baby blues’, also causes insomnia, a loss of appetite, and an inability to accomplish daily tasks. This can impair the mother's capacity to care for her infant, which in turn can harm the child's growth and development. The well-being of both mother and child is at stake, and such a hindrance cannot be risked. Can technology help alleviate PPD for new mothers? With the help of Social Assistive Robots (SARs), it looks like it can.

Illustration by Nour Ahmed

Literature dedicated to PPD and SARs in the MENA region is limited. In this article, I introduce the topic of SARs in healthcare and address their intersection with PPD, showing how they can mitigate the aforementioned stressors. To help bridge the literature gap, I conducted interviews as well as desk research to gauge the effect of SARs on PPD in Egyptian mothers.

The guiding question of my research, albeit simple, proved to be compelling: In what ways can emerging technologies, such as social robots, assist in healthcare, and would new mothers in Egypt be open to it?

Emerging technologies such as SARs have been deemed potentially useful in healthcare, for both patients and healthcare personnel. Two robotic systems that could be of great use to patients with PPD are Pepper and Ryan, developed by SoftBank Robotics and by Prof. Mohammad Mahoor's lab at the University of Denver, respectively.

Pepper is a humanoid robot that communicates through touch as well as natural language. It can interpret, and also express, facial expressions and emotional tone of voice. This means Pepper does not only learn through algorithms and machine learning; it also learns by interacting with its human users and adapting to their needs (Seifert et al. 2022). It can be used to monitor a mother's emotions and mental state, provide immediate feedback, set doctor appointments, or even video call family and friends!

If a mother's PPD requires therapeutic assistance, this is where Ryan comes in. Ryan is also a humanoid robot, but unlike Pepper, Ryan is designed to deliver internet-delivered cognitive behavioural therapy (iCBT) (Dino et al. 2019). CBT has been shown to greatly decrease symptoms of depression and anxiety, as well as increase self-perceived maternal efficacy, when used to treat patients with PPD (Branquinho et al. 2022).

SARs and Egyptian Mothers

The following is an exploration of the perception and potential impact of SARs in their capacity to act as care-surrogates for struggling Egyptian mothers.

To assess the potential impact of SARs on new mothers in Egypt, I spoke with a diverse group of new Egyptian mothers. Most of them asserted that despite the support of friends and family, they felt alone and helpless. When they opened up about their feelings of depression, they were assured that their acute despair was typical. Despite being told there was nothing to worry about, they felt greatly misinformed about what to expect mentally when caring for a child.

While conducting the interviews, I was surprised by the strong positivity toward the idea of SARs, especially because it contradicted the results of a study on the perception of SARs in healthcare in the MENA region, which found hesitancy and low acceptance toward deploying SARs in healthcare and education (Mavridis et al. 2012). While my field research is admittedly not a conclusive representation of SARs and their intersection with PPD in Egypt, since the sample of mothers was too small to represent a whole nation's demographics, the perspectives are undoubtedly worth investigating.

In what follows, I explain why, despite not cutting across racial and socio-economic cleavages, the interviewed sample offered valuable insights into the intersection between PPD and SARs in Egypt.

Firstly, none of the mothers thought that SARs would hinder their experience of motherhood; in fact, there was consensus that the extra help would be welcomed. They explained that SARs would be an unbiased source of comfort, capable of listening to their requests and fears while also offering medically needed assistance. Because the carebots are objective and unbiased, they would allow the mothers to freely express their unfiltered feelings without judgment. This, along with iCBT, would serve as a point of reflection for the mother to analyse what she genuinely requires. The mothers were also adamant that the carebots would free up a lot of time; having some time to themselves, for themselves, they said, would have greatly improved their experience with PPD.

Despite the widespread support for carebots, some mothers felt they were simply unnecessary, claiming that the problem was not one requiring external support; they had all the outside help they could want. Rather, they felt misled by a healthcare system that fails to acknowledge PPD, and said that the magnitude of their emotions could not be articulated verbally, only felt: something a carebot is simply incapable of comprehending, as it lacks compassion and “that warm touch”, as one interviewee explained.

But what if SARs could understand the nuance of emotion? As we know, empathic care largely improves health outcomes (Seifert et al. 2022). But do Pepper and Ryan, for instance, display the capacity for affect?

In the case of Pepper, it has sensors that detect facial expressions and gestures. It matches these cues against what it has learned from previous input, and can thereby recognize how its human counterpart is feeling.

Ryan, in turn, takes in a great deal of heavily emotion-based information. To provide adequate iCBT help, it needs some emotional understanding and the capacity to gauge the weight of this input.

Privacy and Data Protection

With SARs having access to this plethora of intimate information, however, we are left wondering about the ethical repercussions of it all.

Explicability and privacy matter, which is why, when I interviewed the mothers, I asked how they feel about the privacy of their data. They indicated that they would prefer the option to review and control what information is recorded, and expressed concern about how this information would be used and where it would be sent. They understood that some information is necessary for medical assistance, but felt that, given the sheer amount of data being gathered, not all of it could be necessary for review. Transparency about what information is recorded and how it is used is therefore essential.

Another ethical concern revolves around the idea of para-social relationships. One of the main ideas behind the development of SARs is that they would attend to the social and medical needs of the people they assist (Fong et al., 2003). Pepper's capabilities, for example, such as perceiving and responding to emotional cues, can create attachment between robot and patient. This could evolve into a structured and seemingly real relationship that replaces natural forms of communication and connection (Schiappa et al., 2007). This is particularly concerning, as it may negatively affect human communication and emotional trust, and may reduce empathy by creating dependency on a unidirectional para-social relationship (Darling, 2016; Glikson & Woolley, 2020).

Can Negative Impacts Be Avoided?

With the proper amount of transparency and explicability, these negative impacts can largely be avoided. In general, the risks vary depending on the type of care required and the mental disorder the patient suffers from; by extension, so do the ways in which these risks can be mitigated.

To ensure the responsible development and deployment of ethical AI, special care and consideration are needed, especially when SARs are used with vulnerable and sensitive populations, such as people suffering from dementia or autism (Shamsuddin et al., 2012; Calo et al., 2011). With regards to PPD, however, these risks are fairly minuscule: SARs would mostly be used as a mental aid and as a screening and monitoring tool for the mother's mental and physical well-being. Some para-social risks still apply when the mother uses a SAR as a therapeutic or iCBT tool, but with proper precautions, knowledge of how SARs use information, and the mothers' acknowledgement that they are strictly interacting with a humanoid robot, there would be enough awareness to prevent disillusionment.

It is also important to stress that SARs are not a replacement for humans: they are tools designed to aid and complement human practitioners. If anything, AI could encourage empathy in the healthcare system and become an additional source of comfort for those who simply need some support and company. As a result, healthcare systems would also be able to reach and aid patients regardless of time and place.

In conclusion, the integration of SARs in PPD care has the potential to positively impact the well-being of new mothers in Egypt, offering personalized support, therapy, and companionship. Further research and exploration in this field are needed to fully understand the benefits, limitations, and ethical implications of SARs in addressing PPD. By embracing technological advancements in healthcare, we can strive to improve the lives of new mothers and ensure the well-being of both mother and child.



References

Branquinho, M., Canavarro, M. C., & Fonseca, A. (2022). A blended cognitive–behavioral intervention for the treatment of postpartum depression: A case study. Clinical Case Studies, 21(5), 438–456.

Calo, C. J., Hunt-Bull, N., Lewis, L., & Metzler, T. (2011, August). Ethical implications of using the PARO robot, with a focus on dementia patient care. In Workshops at the Twenty-Fifth AAAI Conference on Artificial Intelligence.

Darling, K. (2016). Extending legal protection to social robots: The effects of anthropomorphism, empathy, and violent behavior towards robotic objects. In Robot law. Edward Elgar Publishing.

Dino, F., Zandie, R., Abdollahi, H., Schoeder, S., & Mahoor, M. H. (2019). Delivering cognitive behavioral therapy using a conversational social robot. 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2089–2095. https://doi.org/10.1109/IROS40897.2019.8968576

Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3–4), 143–166.

Glikson, E., & Woolley, A. W. (2020). Human trust in artificial intelligence: Review of empirical research. Academy of Management Annals, 14(2), 627-660.

Mavridis, N., Katsaiti, M.-S., Naef, S., et al. (2012). Opinions and attitudes toward humanoid robots in the Middle East. AI & Society, 27, 517–534.

Schiappa, E., Allen, M., & Gregg, P. B. (2007). Parasocial relationships and television: A meta-analysis of the effects. In Mass media effects research: Advances through meta-analysis (pp. 301–314).

Seifert, J., Friedrich, O., & Schleidgen, S. (2022). Imitating the human: New human–machine interactions in social robots. NanoEthics, 16, 181–192.

Shamsuddin, S., Yussof, H., Ismail, L., Hanapiah, F. A., Mohamed, S., Piah, H. A., & Zahari, N. I. (2012, March). Initial response of autistic children in human-robot interaction therapy with humanoid robot NAO. In 2012 IEEE 8th International Colloquium on Signal Processing and its Applications (pp. 188-193). IEEE.

