AI Simulation Gives People a Glimpse of Their Potential Future Self

In an initial user study, the researchers found that after interacting with Future You for about half an hour, people reported reduced anxiety and felt a stronger sense of connection with their future selves.

“We do not have an actual time machine yet, but AI can be a kind of virtual time machine. We can use this simulation to help people think more about the consequences of the choices they are making today,” says Pat Pataranutaporn, a recent Media Lab doctoral graduate who is developing a program to advance human-AI interaction research at MIT, and co-lead author of a paper on Future You.

Pataranutaporn is joined on the paper by co-lead authors Kavin Winson, a researcher at KASIKORN Labs, and Peggy Yin, a Harvard University undergraduate; Auttasak Lapapirojn and Pichayoot Ouppaphan of KASIKORN Labs; and senior authors Monchai Lertsutthiwong, head of AI research at the KASIKORN Business-Technology Group; Pattie Maes, the Germeshausen Professor of Media, Arts, and Sciences and head of the Fluid Interfaces group at MIT; and Hal Hershfield, professor of marketing, behavioral decision making, and psychology at the University of California at Los Angeles. The research will be presented at the IEEE Conference on Frontiers in Education.

A realistic simulation

Studies about conceptualizing one’s future self date back to at least the 1960s. One early method aimed at improving future self-continuity had people write letters to their future selves. More recently, researchers used virtual reality goggles to help people visualize future versions of themselves.

But none of these approaches were very interactive, limiting the impact they could have on a user.

With the advent of generative AI and large language models like ChatGPT, the researchers saw an opportunity to create a simulated future self that could discuss someone’s actual goals and aspirations during a normal conversation.

“The system makes the simulation very realistic. Future You is much more detailed than what a person could come up with by just imagining their future selves,” says Maes.

Users begin by answering a series of questions about their current lives, things that are important to them, and goals for the future.

The AI system uses this information to create what the researchers call “future self memories,” which provide a backstory the model draws on when interacting with the user.

For example, the chatbot might talk about the highlights of someone’s future career or answer questions about how the user overcame a particular challenge. This is possible because ChatGPT has been trained on extensive data involving people talking about their lives, careers, and good and bad experiences.
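
The article does not spell out the implementation, but the general pattern is straightforward: fold the survey answers into a backstory that is passed to the chat model as a system prompt. Below is a minimal sketch of that idea, assuming the OpenAI chat-completions API; the survey fields, prompt wording, and model name are illustrative stand-ins, not the authors’ implementation.

```python
# Minimal sketch of the "future self memories" idea, not the authors' code.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Hypothetical survey answers; the real system asks a longer questionnaire.
survey = {
    "name": "Alex",
    "age": 22,
    "values": "creativity, family, financial stability",
    "goal": "become a high-school science teacher",
}

# Fold the answers into a backstory the model draws on during the chat.
future_self_memories = (
    f"You are {survey['name']} at age 60, looking back on your life. "
    f"At {survey['age']} you valued {survey['values']} and hoped to {survey['goal']}. "
    "Invent a plausible backstory of how that goal played out, speak in the "
    "first person using phrases like 'when I was your age', and remind the "
    "user that you are only one possible version of their future, not a prophecy."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": future_self_memories},
        {"role": "user", "content": "Did becoming a teacher work out for you?"},
    ],
)
print(response.choices[0].message.content)
```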

The user engages with the tool in two ways: through introspection, when they consider their life and goals as they construct their future selves, and revision, when they contemplate whether the simulation reflects who they see themselves becoming, says Yin.

“You can picture Future You as a story search space. You have a chance to hear how some of your experiences, which may still be emotionally charged for you now, could be metabolized over time,” she says.

To help people visualize their future selves, the system generates an age-progressed image of the user. The chatbot is also designed to provide vivid answers using phrases like “when I was your age,” so the simulation feels more like a real future version of the individual.

The ability to take advice from an older version of oneself, rather than a generic AI, can have a stronger positive impact on a user contemplating an uncertain future, Hershfield says.

“The interactive, vivid elements of the platform give the user an anchor point and take something that might lead to anxious rumination and make it more concrete and productive,” he adds.

But that realism could backfire if the simulation moves in a negative direction. To prevent this, the researchers ensure Future You cautions users that it shows only one possible version of their future self, and that they have the agency to change their lives. Providing alternate answers to the survey yields an entirely different conversation.

“This is not a prophecy, but rather a possibility,” Pataranutaporn adds.

Aiding self-development

To evaluate Future You, the researchers conducted a user study with 344 people. Some users interacted with the system for 10-30 minutes, while others either interacted with a generic chatbot or simply filled out surveys.

Participants who used Future You were able to build a closer relationship with their ideal future selves, based on a statistical analysis of their responses. These users also reported less anxiety about the future after their interactions. In addition, Future You users said the conversation felt sincere and that their values and beliefs seemed consistent in their simulated future identities.

“This work charts a new course by combining a well-established psychological technique for visualizing times to come – an avatar of the future self – with cutting-edge AI. This is exactly the type of work academics should be focusing on as technology to build virtual self models merges with large language models,” says Jeremy Bailenson, the Thomas More Storke Professor of Communication at Stanford University, who was not involved with this research.

Building off the results of this initial user study, the researchers continue to fine-tune the ways they establish context and prime users so they have conversations that help build a stronger sense of future self-continuity.

“We want to guide the user to talk about certain topics, rather than asking their future selves who the next president will be,” Pataranutaporn says.

They are also adding safeguards to prevent people from misusing the system. For instance, one could imagine a company creating a “future you” of a potential customer who achieves some great outcome in life because they purchased a particular product.

Moving forward, the researchers want to study specific applications of Future You, perhaps by enabling people to explore different careers or visualize how their everyday choices could impact climate change.

They are also collecting data from the Future You pilot to better understand how people use the system.

“We do not want people to become dependent on this tool. Instead, we hope it is a meaningful experience that helps them see themselves and the world differently, and supports self-development,” Maes says.