
The AI Buddy Project is building an assistant to support kids of military families

Image from The AI Buddy Project video
Image Credit: Vidax Center



If a soldier dies in war, can a virtual assistant play a role in their kid’s life to listen, encourage, and be a friend in hard times? How long should the assistant be an active part of that child’s life: A few weeks? Years?

These are questions being considered by The AI Buddy Project, an effort recently launched to create a virtual assistant for the children of active duty military servicemembers deployed overseas.

AI Buddy is being designed to read bedtime stories, play games, help with homework, and chat with kids to gather insights into their emotional well-being. Regular progress reports are then shared with the child’s family so they can see how the child is coping or whether they are sliding into depression.

AI Buddy will take into consideration the length of a soldier’s deployment and is even being designed to help if a parent is killed in action, WeBelievers cofounder Marco Vega told VentureBeat in a phone interview. The bot is able to ask roughly 150 questions, such as “How are you feeling today?” but after the death of a parent, AI Buddy’s primary focus will be to listen.
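To make that behavior concrete, here is a minimal, hypothetical sketch in Python of how an assistant might draw from a fixed question corpus and switch to a listening-first mode after a bereavement. The class name, prompts, and logic are illustrative assumptions made for this article, not the project’s actual implementation.

from dataclasses import dataclass, field
import random

# Hypothetical sketch: how a buddy-style assistant might pick from its
# question corpus and shift to listening-first behavior after a loss.
CHECK_IN_QUESTIONS = [
    "How are you feeling today?",
    "What was the best part of your day?",
    "Is there anything you're worried about right now?",
    # ...the real corpus reportedly holds roughly 150 such prompts.
]

@dataclass
class BuddySession:
    bereavement_mode: bool = False   # set when a parent has been killed in action
    asked: list = field(default_factory=list)

    def next_prompt(self) -> str:
        """Return the next thing the buddy says."""
        if self.bereavement_mode:
            # After a loss, the buddy's primary job is to listen, not to quiz.
            return "I'm here. Tell me as much or as little as you want."
        remaining = [q for q in CHECK_IN_QUESTIONS if q not in self.asked]
        question = random.choice(remaining or CHECK_IN_QUESTIONS)
        self.asked.append(question)
        return question

session = BuddySession()
print(session.next_prompt())       # a gentle check-in question
session.bereavement_mode = True
print(session.next_prompt())       # listening-first response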


“I think that if there’s one thing that we’ve learned, it’s that one of the worst things you can tell someone who is mourning a loss is ‘I know exactly how you feel,'” Vega said. “You just want to make sure that the child is not locking themselves out of the world, and usually the best way to do that is by opening up a channel to listen.”

The AI Buddy Project brings together Vidax Center, an Argentina-based nonprofit focused on child trauma, with creative agencies Clowder Tank and WeBelievers. A project spokesperson declined to share details about the specific medical professionals playing a role in the project but said psychologists and pediatricians are among the participants.

Social media posts and text messages from family can be scanned to give the assistant a familiar voice and make the character “a virtual member of the family,” according to a pitch video. Emotional well-being will be assessed with natural language processing developed by The AI Buddy Project.
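As a rough illustration of the kind of signal such a system might extract, the following hypothetical Python sketch scores chat messages with a toy keyword list and rolls them into the sort of summary a family report could contain. The project’s actual NLP is not public; the word lists, thresholds, and report fields below are assumptions for demonstration only.

from statistics import mean

# Illustrative only: a toy keyword-based mood score over a child's chat
# messages, aggregated into a simple summary. Not the project's real model.
NEGATIVE_WORDS = {"sad", "scared", "alone", "miss", "angry", "tired", "cry"}
POSITIVE_WORDS = {"happy", "fun", "excited", "proud", "friend", "love"}

def mood_score(message: str) -> float:
    """Score one message from -1 (negative) to +1 (positive)."""
    words = [w.strip(".,!?") for w in message.lower().split()]
    neg = sum(w in NEGATIVE_WORDS for w in words)
    pos = sum(w in POSITIVE_WORDS for w in words)
    total = neg + pos
    return (pos - neg) / total if total else 0.0

def weekly_report(messages: list[str]) -> dict:
    """Aggregate scores into the kind of summary a family might receive."""
    scores = [mood_score(m) for m in messages]
    return {
        "average_mood": round(mean(scores), 2),
        "low_mood_messages": sum(s < -0.5 for s in scores),
        "flag_for_follow_up": mean(scores) < -0.3,
    }

chat_log = [
    "I had fun at school with my friend today",
    "I miss dad and I feel sad and alone tonight",
    "kind of tired, nothing happened",
]
print(weekly_report(chat_log))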

Instead of a single bot, app, or website where kids can interact with AI Buddy, the project wants to partner with platforms kids already use, meeting each child where they are rather than requiring them to come to it.

Currently in alpha, the AI Buddy Project plans to release a beta version for more public use within the next 12 months. Meanwhile, the AI Buddy Project will continue to refine its corpus of questions and examine its own research and thinking to ensure that “the promise of what we’re building doesn’t open doors we didn’t want to open.”

For example, at first the project planned to create “an AI mommy and daddy” modeled on the parents’ speech patterns and word choice, but it chose instead to depict the assistant as a character like a monkey or teddy bear.

“We felt that the ethical and moral implications of actually creating that mind clone would have been far more negative than the possibility of creating just a buddy that takes some of the moral values and the essence of the family,” Vega said. “If you were to create the mind clone of the mom or the dad, that thing cannot live forever. You would have to shut it down at some point, right? I mean, you would have to determine: Does it die when that mind clone gets to 75 years of age, or when the child gets married and has kids?”

The idea of keeping an AI Buddy for years evokes digital ghosts: NLP built from the essence of a human being that can survive and continue to mimic a person even after they’re dead. Digital ghosts are part of the work done by the startup Luka, as well as by archivists at the University of Southern California who captured interactive holographic memoirs of Holocaust survivors.

Vega said the project chose to begin with the children of deployed servicemembers because they are an often overlooked segment of those negatively affected by war. According to a recent study, one in four children of military servicemembers have thought about suicide — twice the national average.

The project will begin with the children of military servicemembers and later extend to other children hurt by war, such as young refugees.

“Anything that creates trauma around a child, there could be a role for artificial intelligence to step in if it’s designed properly,” Vega said.
