Monday, 18 February 2002

Speaking of Multimodal Interaction

Here’s another reason why it helps to be 1) thinking ahead and 2) shifting to XHTML:

“Web pages you can speak to and gesture at—W3C is developing standards for a new class of mobile devices that support multiple modes of interaction.”

The W3C’s new Multimodal Interaction Activity is extending the Web user interface to include more choices for interaction, including one’s voice, a keypad, keyboard, mouse, stylus, or other input device. The working group is developing new markup specifications for this, and the public is welcome to join the discussion: send an e-mail to www-multimodal-request@w3.org with the word subscribe in the subject line, or visit the W3C discussion list archives.

Last fall I wrote here about the SALT Forum developing a royalty-free, platform-independent standard (SALT = Speech Application Language Tags) to enable multimodal and telephony-enabled access to information, applications, and Web services from PCs, phones, tablet PCs, and PDAs.

For us developers, the good news is that we won’t need to learn an entirely new markup language. SALT will extend existing markup, such as XHTML, with a small set of new tags.
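To give a flavor of what that could look like, here is a rough, hypothetical sketch of SALT-style tags sitting alongside an ordinary XHTML text field, loosely based on early SALT Forum examples. The element and attribute names shown (prompt, listen, grammar, bind) come from the draft discussions, but the exact syntax may well change as the spec evolves, and the salt: prefix would need to be declared as a namespace on the page’s root element:

    <!-- hypothetical SALT tags alongside a standard XHTML input -->
    <input type="text" name="city" id="city" />

    <salt:prompt id="askCity">Which city would you like the weather for?</salt:prompt>

    <salt:listen id="getCity">
        <salt:grammar src="cities.grxml" />
        <salt:bind targetelement="city" value="//city" />
    </salt:listen>

The appeal of this approach is that the same page still works with just a keyboard, while a speech-enabled browser or phone can use the prompt and listen tags to fill in the field by voice.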

Exciting developments are on the horizon.

Addendum

The SALT Forum released its initial draft specification on February 19th, the day after I wrote the above. See the SALT site and the article at VoiceXMLPlanet.com.

10:28 pm PST, 18 February 2002

Categories: Design, Development, Standards

