Sponsors of NRW in Second Life

April 17, 2011

Thank Our Sponsors

If you enjoyed this week’s events, please take a moment to thank our sponsors!

Diamond Sponsors

  • Avvenimenti Iblei magazine – Asia Connell
  • Private Donation – Maccus McCullough
  • RM Ellingson Design & Development LLC – rogere Resident

Gold Sponsors

  • Archivopedia LLC – Archivist Llewellyn
  • Private Donation – Universa Vanalten

Silver Sponsors

  • Confederated Response Force – patrick Thorkveld
  • Sonalysts Studios – Joey Aboma
  • National Space Society – Ariel Miranda
  • Anya’s Enterprise – Anya Heberle

Bronze Sponsors

  • Yoske Analytics, Inc. – Mobius Overdrive & Ainsley Fizir

Chinese Language and Cultural Training Bots

April 16, 2011
Chinese Island Restaurant

Xilin Yifu talked about the Chinese Studies Program at Monash University, which has taken existing concepts and integrated them into task-based language and culture lessons. He is working on a PhD thesis on teaching language and culture in virtual worlds, with the working title “Getting Immersed in Chinese.”

The role of non-player characters (bots) in task-based language and culture learning

Automated non-player characters (NPCs) are common in many computer/video games and digital learning environments, performing both a ‘scaffolding’ function and an interactive function. Bots are also common in VWs like Second Life, both of the prim-bot kind and the avatar-bot kind, with varying levels of functionality. Chatbots are likewise common in many 2D and 3D environments, and while they provide some degree of interactivity, it is often limited, unfocussed and circular in nature.

The Chinese Studies Program at Monash University has taken some of these existing concepts and integrated them into task-based language and culture lessons. A system of avatar-bot NPCs with a range of standard SL functions (give, rez objects, move) has been developed in combination with an AIML chatbot database tailored to the needs of the Chinese language and the pedagogical goals of the lessons. One unique feature of this combination is that actions/functions can be activated from within the AIML program itself, allowing them to occur as a natural part of the dialogues between learner and NPC. In addition, the NPCs are able to interact with a number of other ‘tools’ developed within the Second Life environment especially for the lessons. Finally, a method for logging learner and NPC interaction has been developed for post-lesson review and data gathering.
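To make the action-triggering idea concrete, here is a minimal Python sketch of how an action marker embedded in an AIML template might be picked up by a bot controller, turned into an in-world action, and logged for post-lesson review. The [ACTION:...] marker convention, the PyAIML package, and the give_object() stub are illustrative assumptions, not the actual Monash system.

# Minimal sketch: an AIML category whose template carries an action marker
# that the bot controller strips out and turns into an in-world NPC action.
# Assumes the PyAIML package ("aiml" on PyPI); the marker convention and the
# give_object() stub are invented for illustration.
import re
import tempfile
import aiml

AIML_RULES = """<?xml version="1.0" encoding="UTF-8"?>
<aiml version="1.0.1">
  <category>
    <pattern>CAN I SEE THE MENU</pattern>
    <template>Of course, here is our menu. [ACTION:give:menu_notecard]</template>
  </category>
</aiml>
"""

ACTION_TAG = re.compile(r"\[ACTION:(\w+):(\w+)\]")

def give_object(item):
    # Placeholder: a real NPC would call its bot framework's inventory-give
    # function here (e.g. hand the learner a notecard).
    print(f"(NPC gives the learner '{item}')")

def main():
    kernel = aiml.Kernel()
    with tempfile.NamedTemporaryFile("w", suffix=".aiml", delete=False) as f:
        f.write(AIML_RULES)
        path = f.name
    kernel.learn(path)

    learner_says = "Can I see the menu"
    reply = kernel.respond(learner_says)

    # Pull any embedded actions out of the reply before chatting it back.
    for verb, item in ACTION_TAG.findall(reply):
        if verb == "give":
            give_object(item)
    chat_text = ACTION_TAG.sub("", reply).strip()
    print("NPC says:", chat_text)

    # Log the exchange for post-lesson review, as described above.
    with open("npc_log.txt", "a") as log:
        log.write(f"LEARNER: {learner_says}\nNPC: {chat_text}\n")

if __name__ == "__main__":
    main()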

These NPCs perform a number of key roles: classroom management (especially with large numbers of learners in-world at the same time), providing learners with focused ‘naturalistic’ linguistic and cultural interaction, providing them with opportunities for ‘meaningful’ communication, acting as gatekeepers for key stages of a particular task, providing learners with key artefacts required to complete the set task, and providing scaffolding both technically and with lesson content.

The above features (technical and pedagogical), as well as the advantages and disadvantages of using such a system, will be discussed in detail during the presentation. Examples from actual lessons will be offered to illustrate how the NPC system is used in practice, with an exemplar machinima of one particular lesson to be shown.

Web: http://www.virtualhanyu.com

When: Saturday, April 16, 2011 @ 5pm (SLT/PDT)

Where: http://slurl.com/secondlife/Monash%20University%202/201/166/26

Xiaohong Fang helps SunTzu

Xiaohong Fang helps SunTzu

Bio

Scott Grant [Xilin Yifu] is a graduate of Monash University with Bachelor of Economics and Master of Translation Studies degrees. He is currently working on his PhD thesis on teaching language and culture in virtual worlds. The working title is “Getting Immersed in Chinese”.

Prior to commencing his teaching and academic career at Monash University, Scott spent 7 years living, studying and working in China.

Kaylee (xilin.yifu)

Kaylee (xilin.yifu)

Scott has taught Chinese language and culture at tertiary level for more than ten years. He also taught translation (Chinese<>English) for a number of years at post-graduate level and is a professionally qualified translator. He has been developing and implementing language and culture lessons and a Chinese-themed learning environment in Second Life for the last two and a half years.

Scott has developed a Chinese-themed virtual learning environment in Second Life on Chinese Island at Monash University 2. In line with constructivist / social-constructivist education principles, this environment has been purpose-designed for the learning of Chinese language and culture both synchronously and asynchronously. In language acquisition terms, the environment constitutes a ‘rich mediated interactive environment’ that, together with the purpose-designed lessons, lesson content and pedagogy, is integrated into the existing formal undergraduate curriculum. The virtual environment and associated lessons are also closely integrated with Moodle, both directly via Sloodle and in parallel. The ‘environment’ also includes a sophisticated non-player character system that has been purpose-designed, developed and used in a range of lessons to date. Both the general environment and the NPC system are under ongoing development, and collaboration and suggestions are warmly welcomed.


Voice Controlled SL Robots

April 16, 2011
Quincy Dagger talks

Controlling the bot

Quincy Dagger talks about how to turn your Plain Old Telephone Service (POTS) or cell phone into a voice controller for your virtual world robots. Using your landline or cell phone, your voice can control your robots hands-free while leaving the Second Life Viewer available for text chatting. The tools you will learn about are FreeSWITCH, an open source voice switch for voice-over-IP (VoIP), and PocketSphinx, an open source continuous speech recognition system.
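As a rough illustration of the receiving end of such a pipeline (not the exact setup shown in the talk), the Python sketch below reads recognized phrases one per line, for example piped from PocketSphinx's pocketsphinx_continuous tool or from a FreeSWITCH dialplan script, and forwards matching command words to an in-world robot script over HTTP. The command vocabulary and the LSL HTTP-in URL are assumptions for illustration.

# Sketch: recognized speech arrives one hypothesis per line on stdin and any
# known command word is forwarded to an in-world robot script over HTTP.
# The command list and the URL below are invented for illustration.
import sys
import requests

# URL that an LSL script would hand out from llRequestURL() (hypothetical value).
ROBOT_URL = "https://example-sim-host/cap/EXAMPLE-CAP-UUID"

COMMANDS = {"forward", "back", "left", "right", "stop"}

def main():
    for line in sys.stdin:
        for word in line.lower().split():
            if word in COMMANDS:
                # The in-world script reads the body in its http_request event
                # and drives the robot accordingly.
                requests.post(ROBOT_URL, data=word, timeout=5)
                print("sent command:", word)
                break  # one command per utterance

if __name__ == "__main__":
    main()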

Time: Saturday, April 16, 2011 @ 3 PM (SLT/PDT)
SLurl: http://slurl.com/secondlife/IEEE%202/56/162/27


Exploring the Caverns of the Moon

April 16, 2011
Sine Arrow

Abstract: Sine Arrow and Marcus MacMoragh are members of the Oregon L5 Society, a chapter of the National Space Society nonprofit, who are preparing a “First Life” NASA proposal and would like your feedback! They are modeling their robotic “moonbat” in Second Life to explore caves on the moon.

Time: Saturday, April 16, 2011 @ 2 PM (SLT/PDT)
SLurl: http://slurl.com/secondlife/National%20Space%20Society/65/171/301
Websites:


AI Assisted Industry Training and Simulation Applications

April 16, 2011
Warbird Cyberstar

Warbird Cyberstar presents this talk about AI-assisted industry training within an interactive 3D application simulation. As an example, I will be demonstrating an interactive 3D simulation of a Standby Monitor Regulator Station from the natural gas industry. The simulation is designed for a technician to perform an operational inspection of a regulator station. Because of the simulator's versatility, the user can also perform other routine functions and cause-and-effect scenarios at their discretion, such as closing a valve on a sensing line or stroking the primary regulators. I have also developed an AI engine built into this simulation: essentially an expert system that understands what the technician is doing in real time. The simulation includes three training modes: Virtual Tour, Training (AI), and Testing.

The training mode (where the AI is located) features real-time tracking of user actions and feedback, using the capabilities of the built-in expert system. In this way the simulation provides an intelligent virtual instructor, personified by a 3D avatar that follows the technician step by step as he or she manipulates the system. This form of artificial intelligence is applicable across a very wide range of training situations, but what I like most about it is that the intelligent instructor never grows bored and has your full attention throughout the training process. The built-in AI prompts the user on what procedure or action is required at any given time. If the trainee performs an incorrect action, the AI agent notifies the trainee immediately, “undoes” the incorrect action to keep the trainee on track, and repositions the simulation back to its previous state so the trainee can try the correct procedure or action again. The instructor also provides a randomization feature that varies the training procedure by randomly selecting different elements that are faulty and need repair. This keeps the training fresh and introduces a degree of uncertainty that forces the trainee to pay attention at all times.
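As a rough sketch of that idea (and not VyperSim's actual expert system), the Python snippet below walks a trainee through an ordered procedure, flags and "undoes" incorrect actions, and randomly selects which element is faulty for a given run. The step and gauge names are invented.

# Toy virtual instructor: ordered procedure, real-time checking of trainee
# actions, an "undo" of wrong actions, and a randomly selected fault per run.
import random

PROCEDURE = [
    "check_inlet_pressure_gauge",
    "close_sensing_line_valve",
    "stroke_primary_regulator",
    "reopen_sensing_line_valve",
]

class VirtualInstructor:
    def __init__(self):
        self.step = 0
        # Randomization feature: pick one element that is "faulty" this run.
        self.faulty_gauge = random.choice(
            ["inlet_gauge", "outlet_gauge", "sensing_line_gauge"])

    def prompt(self):
        if self.step < len(PROCEDURE):
            print("Instructor: next, please", PROCEDURE[self.step])
        else:
            print("Instructor: procedure complete.")

    def observe(self, action):
        if self.step >= len(PROCEDURE):
            return
        if action == PROCEDURE[self.step]:
            print(f"Instructor: correct ({action}).")
            self.step += 1
        else:
            # Undo the wrong action and restore the previous state so the
            # trainee can try again from the same point.
            print(f"Instructor: '{action}' is not the next step; undoing it.")
        self.prompt()

if __name__ == "__main__":
    instructor = VirtualInstructor()
    instructor.prompt()
    instructor.observe("stroke_primary_regulator")    # wrong step, gets undone
    instructor.observe("check_inlet_pressure_gauge")  # correct step
    print("Faulty element this run:", instructor.faulty_gauge)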

Also included is an AI feature that lets the trainee ask the instructor for help locating gauges that are leaking or defective. This assistance is provided by an intelligent 3D avatar that accompanies the trainee throughout the simulation. If a trainee is unable to locate a leaking or faulty gauge, the AI instructor shows the trainee exactly where those gauges are located. The avatar not only provides assistance at the user’s request, but also demonstrates the shortest possible traveling route to reach those gauges. There is also adaptive programming that works alongside the regulator monitor's gauges and valves and affects pressure in real time. The user can stroke the primary regulators with a wrench and run all the “cause and effect” scenarios at his or her discretion, and even blow up a house at the end of the line.
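The "shortest possible traveling route" behaviour can be illustrated with a plain breadth-first search over a small graph of walkway waypoints, as in the sketch below; the station layout is invented and this is not VyperSim's actual path planner.

# Breadth-first search for the shortest walkway route from the trainee's
# position to a faulty gauge. The waypoint graph is invented for illustration.
from collections import deque

WALKWAYS = {
    "entrance": ["inlet_gauge", "control_panel"],
    "inlet_gauge": ["entrance", "primary_regulator"],
    "control_panel": ["entrance", "outlet_gauge"],
    "primary_regulator": ["inlet_gauge", "outlet_gauge"],
    "outlet_gauge": ["control_panel", "primary_regulator"],
}

def shortest_route(start, goal):
    # Return the shortest waypoint path from start to goal, or None.
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in WALKWAYS[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

if __name__ == "__main__":
    # The instructor avatar would walk the trainee along this route.
    print(shortest_route("entrance", "outlet_gauge"))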

I believe these AI features make this type of computer-based training more engaging while also delivering greater learning retention for its end users. It is a strong foundational training tool, and I expect that practicing on this simulator will give the user a sound foundation to build upon and go a long way towards improving safety in the natural gas industry. The project is a good example of AI-assisted training within an interactive 3D application simulation, and of commercial applications of artificial intelligence in business and industry.

VyperSim

Kevin Simkins’ Bio: As a committee member for the Electric Association Advisory for the Midwest Energy Association, Kevin Simkins serves as subject matter expert and liaison between virtual worlds and the energy industry. He conducts research and development into new algorithms for 3D virtual worlds as well as interactive 3D application simulations and collaboration tools. Simkins is the Ontology/Taxonomy Lead for the IEEE VW Standard Working Group. As the only individual ever to have three winning entries in the Federal Virtual Worlds Challenge, Simkins has distinguished himself as a competitive leader in the fields of virtual worlds and artificial intelligence.

Time: Saturday April 16, 2011 @ 9 AM (SLT/PDT)

SLurl: http://slurl.com/secondlife/IEEE%202/56/162/27

VyperSim: http://vypersim.com/VyperSim_Home.html


Basic AIML authoring for artificially intelligent agents and chat bots

April 15, 2011

Archivist Llewellyn and Joey Aboma provided a class on basic AIML authoring for artificially intelligent agents and chat bots. This is the first class on this topic at the Artificial Intelligence Learning Center (AILC), intended to help others learn the basics of bringing such agents to life in virtual environments.

Archivist and Joey

SLIDES: http://www.slideshare.net/01archivist/basic-aiml-class
LOCATION: http://slurl.com/secondlife/IEEE%202/57/162/27


Penn State’s Educational Robotics Projects and Exhibits

April 13, 2011

Penn State Abington Robotics NRW Tour (4-13-11)
Plato Pizzicato

Plato Pizzicato takes us on a tour of Penn State Abington’s educational robotics projects and exhibits in Second Life (SL). The tour includes discussion of Real Life (RL) robot contests, outreach, and plans to bridge RL and SL events.

Robots In Auto Industry

1. Penn State hosts annual RL competitions in Philadelphia, PA.
The robot contests are open to K-12, college, and beyond. The activities support outreach, curriculum enhancement, and undergraduate research. Check our websites:

2. Robot Maze Exhibit
Users can create a notecard containing robot commands to allow a robot to move through a maze. The goal is to educate students and the public about introductory robotics and to demonstrate the usefulness of virtual worlds for hosting interactive learning tools that are accessible to a global audience. A minimal command interpreter along these lines is sketched after this list.

3. Virtual Living Space to Model Human-Robot Interactions
A virtual living room and kitchen serve as a platform to study the behavior of robots in a living space. Roomba robots (floor-cleaning robots) have been programmed to navigate the space.

4. Penn State Abington – Hof (Germany) Robot Workshop Exhibit Space
In March 2010, a team of PSU Abington students and faculty traveled to Hof University in Germany to participate in a week-long robotics workshop. One of the modules involved using Second Life to create exhibits on a variety of robotics application areas, such as collision avoidance, robots in manufacturing, and security robots. Teams comprising Penn State and German students were able to research and create virtual robotics exhibits in SL in a single day. A public poster session, held in SL and attracting audience members from around the globe, concluded the course module.

5. 3D Simulation of an RL Smart Home
We have begun construction of a virtual replica of an RL campus smart home (consisting of a living room and kitchen area). The smart home will be equipped with web cameras and people-tracking algorithms to assess overall health and quality of living for aging-in-place studies. The SL model will allow us to experiment with rearranging furniture and appliances to study the impact on navigation and camera coverage. We also plan to enable communication and interaction between the RL and SL facilities to improve assessment, monitoring, and visualization of living quality.
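
As a minimal sketch of the maze exhibit's command-interpreter idea (referenced in item 2 above), the Python snippet below drives a robot through a small grid maze from a notecard-style list of commands. The command names, maze layout, and starting heading are invented for illustration; the in-world exhibit reads its commands from a notecard, presumably via an LSL script.

# Toy interpreter: a notecard-style command list (FORWARD / LEFT / RIGHT)
# moves a robot through a grid maze from S towards G. Walls (#) block moves.
MAZE = [
    "#####",
    "#S..#",
    "###.#",
    "#G..#",
    "#####",
]

NOTECARD = "FORWARD FORWARD RIGHT FORWARD FORWARD RIGHT FORWARD FORWARD"

HEADINGS = [(-1, 0), (0, 1), (1, 0), (0, -1)]  # north, east, south, west

def find(maze, mark):
    for r, row in enumerate(maze):
        if mark in row:
            return r, row.index(mark)

def run(commands, maze):
    row, col = find(maze, "S")
    heading = 1  # start facing east
    for command in commands.upper().split():
        if command == "LEFT":
            heading = (heading - 1) % 4
        elif command == "RIGHT":
            heading = (heading + 1) % 4
        elif command == "FORWARD":
            dr, dc = HEADINGS[heading]
            if maze[row + dr][col + dc] != "#":
                row, col = row + dr, col + dc
    return "goal reached!" if maze[row][col] == "G" else f"stopped at {(row, col)}"

if __name__ == "__main__":
    print(run(NOTECARD, MAZE))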

Date & Time: Wednesday (13 Apr 2011) at 4 PM (SLT/PDT)
Location: http://slurl.com/secondlife/Penn%20State%20Isle%202/121/21/27
