CoPilot Designer has an exciting assortment of new features in version 2.6! This update further empowers learning content creators to design the immersive learning content employees need. The latest release includes additional language support, flexible spawn points for users and virtual humans, and a customizable results screen. Let’s dive in.
Blog Contents:
- ¡Hola! Salut! Olá! Ciao! – Expanded language support
- Location, Location, Location – Flexible spawn points
- Will the Real Virtual Humans Please Stand Up – Standing virtual humans
- Author Conversational Simulations in Which Learners Can Direct Their Attention to Multiple Virtual Humans
- The (Immersive Learning) Results Are In – Customizable results screen
¡Hola! Salut! Olá! Ciao!
Building on the existing language options of German (Germany) and English (United States), we are excited to provide learning content creators with 7 new language options for creating immersive learning content:
- Spanish (Mexico)
- Spanish (Spain)
- French (France)
- French (Canada)
- Portuguese (Brazil)
- Portuguese (Portugal)
- Italian (Italy)
Author content for more languages and regions
CoPilot Designer is an English-first authoring tool. What does this mean? The user experience and user interface in CoPilot Designer for learning content creators are in English. However, using a new language selection drop-down menu, creators can choose the language a learner will use while consuming the learning content they create. The learner can then complete the immersive learning experience in the selected language.
In CoPilot Designer v2.6, the text-to-speech capability is the most noticeable change in terms of expanded language support. The virtual humans are now well-versed in 9 different languages: depending on the language selected in the Project Configuration interface in CoPilot Designer, the virtual human will speak in that language.
In immersive learning role play experiences, the emotional realism portrayed by virtual humans creates a strong sense of presence, which increases ‘knowing’ over ‘remembering’. Mihaly Csikszentmihalyi discovered that people find genuine satisfaction during a state of consciousness called Flow. Engaging in spoken dialogue with virtual human avatars is a crucial element in achieving this state of Flow during a learning experience.
These expanded language capabilities enable more learners to have immersive learning experiences in their native language. This will allow learners to enjoy XR content’s proven engagement as they practice and apply soft skills in their preferred language, and enable more organizations to take advantage of the benefits of VR soft skills training.
Note: for the learner, navigation of the Talespin App and launching into immersive content will be in English. The experience and interactions themselves will be in the language set in CoPilot Designer.
Location, Location, Location
Immersive soft skills learning simulations have emerged as a popular use case for the Metaverse, and virtual humans and virtual environments are critical components in the design of these experiences. Additional key considerations for immersive learning content design include deciding which virtual human most realistically matches the simulated conversation’s topic, what the appropriate attire would be for the particular conversation, what virtual environment the conversation should take place in, and where within a given virtual environment the user and virtual humans will be positioned.
CoPilot Designer gives content creators control over these decisions, with the available virtual environments offering various spawn point locations to utilize while designing an experience. Prior to version 2.6, these spawn points were predetermined, with virtual humans and users positioned in specific locations. Version 2.6 brings new flexibility: users and virtual human avatars can be situated in any of the available spawn point locations within a virtual environment, adding new options in the design of VR soft skills simulations.
Will the Real Virtual Humans Please Stand Up
The greater experience design flexibility provided by v2.6 doesn’t stop there. The addition of standing virtual human characters enables a new set of immersive learning use cases in which simulating standing conversations is a critical component in establishing the simulation’s realism.
For example, training use cases such as an interaction between a cafe employee and a customer, or a retail sales conversation that takes place in a storefront, may call for standing virtual characters. In designing such a simulation, the learner can now be situated behind the cash register in a standing position while the virtual character takes on the role of a customer waiting in line to order their morning coffee. These new spawn point options help learning designers create more content for a wider range of use cases, and offer flexibility in terms of how those use cases are portrayed.
Author Conversational Simulations in Which Learners Can Direct Their Attention to Multiple Virtual Humans
Another new feature in v2.6 of CoPilot Designer is the ability for learners to determine which virtual character they are speaking to during the flow of a simulated conversation. As more organizations and individuals use the Metaverse for learning and collaboration, realistically simulating situations involving multiple virtual avatars will be critical.
Learning simulations created with CoPilot Designer now present learners with icons that let them choose which virtual human avatar they would like to address when selecting a dialogue option. This user interface feature is only visible when there is more than one virtual human present within a learning module.
This new feature helps learning designers create learning simulations that include multiple virtual humans representing colleagues or customers. For example, meetings often involve a group of people, and a presentation on a business plan may require several team members to be present. In these scenarios, giving the learner the ability to speak to one virtual character, then simply shift their focus and engage another, makes for a more realistic learning experience, similar to the way we navigate group conversations in the real world.
The (Immersive Learning) Results Are In
Accurate scoring for skills development and the ability to deliver real-time feedback to learners are two key benefits of immersive learning content. Feedback is an important element in ensuring learners gain the full benefit and understanding from a learning experience. Version 2.6 of CoPilot Designer now gives learning content creators the ability to customize the feedback provided on the results screens learners see when they complete a particular lesson or learning module created with CoPilot Designer.
Learners can now be presented with relevant information to help them understand and evaluate their current skill competencies, and identify areas for improvement and behavioral change. Creators can also ensure the information provided is relatable for the learner. Whether by keeping tone in mind, using organization-specific or role-specific nomenclature and training methodologies, or accounting for other key considerations, creators can now deliver custom real-time feedback to learners within an immersive learning experience. For further insights into skills development and learning journey progress, managers and L&D professionals can see this information in the Talespin Dashboard.
With the latest updates to CoPilot Designer, learning content creators are empowered to create a more diverse range of immersive learning experiences.