AT&T showed off some emerging mobile and cloud technologies this week at its Innovation Showcase event in New York City.

Researchers from AT&T Labs demonstrated how voice, expressions and even brain waves can be interpreted by mobile devices. Some of the research could reach phones and tablets within a few years, while some might never see the light of day.

AT&T is offering APIs (application programming interfaces) to let developers explore how the technologies might be used and make contributions of their own.

Attached to this story are images of the research shown at the event.

Agam Shah covers PCs, tablets, servers, chips and semiconductors for IDG News Service. Follow Agam on Twitter at @agamsh. Agam's e-mail address is [email protected]