

Use Android + AI for “Personal Robots”?

By Doug Fiedor
August 9, 2010


Nao, a robot programmed by researchers at the University of Hertfordshire, is said to be "able to display and detect emotions", The Telegraph in London reports.  Computer scientist Lola Cañamero said, "Nao is able to detect human emotions through a series of non-verbal 'clues', such as body-language and facial expressions, and becomes more adept at reading a person's mood through prolonged interaction."  Watching the video at the site, one sees that the 23-inch robot is rather cute -- and it's almost a shame that they go on to describe how it operates.


Anyway, they used a number of modern sensors feeding an artificial-intelligence routine that detects the attitude and emotion of the human interacting with the robot, and they taught the robot appropriate responses.

Back when the Internet was nothing but a terminal hook-up between universities and researchers, we played with a software package something like that, called Eliza and, later, Doctor.  In fact, in my physiology research laboratory, we dedicated an IBM PC specifically to an offshoot of Eliza that was programmed in a combination of C and Prolog.  I wrote the code and designed a parsing system containing 5,000 common words for the natural-language section.  However, we kept an "open" research notebook for the project and welcomed suggestions from all of the researchers on the floor.

The unsuspecting user could actually have a conversation with that computer.  Some thought we were tricking them and that a real person in another room was doing the answering.  Even a quick-thinking human would often need a few minutes to trick the computer into an incorrect answer.
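For readers who never ran into Eliza: the core trick is nothing more than keyword spotting against a table of response templates, plus a pronoun-swapping pass so the program can echo the user's own words back at them.  A toy sketch of that idea in Java follows -- it is nothing like our old C-and-Prolog code, and the handful of patterns in the table are invented purely for illustration:

```java
import java.util.*;
import java.util.regex.*;

// A toy Eliza-style responder: match a keyword pattern, reflect pronouns,
// and fill the captured phrase into a canned reply template.
public class TinyEliza {
    // Illustrative patterns only -- a real system would have hundreds.
    private static final String[][] RULES = {
        { "i feel (.*)",    "Why do you feel %s?" },
        { "i am (.*)",      "How long have you been %s?" },
        { "because (.*)",   "Is that really the reason?" },
        { "(.*) robot(.*)", "Do machines worry you?" },
    };

    // Swap first- and second-person words so replies read naturally.
    private static String reflect(String phrase) {
        Map<String, String> swaps = new HashMap<>();
        swaps.put("my", "your"); swaps.put("your", "my");
        swaps.put("me", "you");  swaps.put("am", "are");
        swaps.put("i", "you");
        StringBuilder out = new StringBuilder();
        for (String w : phrase.toLowerCase().split("\\s+")) {
            out.append(swaps.getOrDefault(w, w)).append(' ');
        }
        return out.toString().trim();
    }

    public static String respond(String input) {
        String text = input.toLowerCase();
        for (String[] rule : RULES) {
            Matcher m = Pattern.compile(rule[0]).matcher(text);
            if (m.find()) {
                String captured = reflect(m.group(1));
                return String.format(rule[1], captured);
            }
        }
        return "Tell me more.";
    }

    public static void main(String[] args) {
        System.out.println(respond("I feel tired of typing"));
        // prints: Why do you feel tired of typing?
    }
}
```

Scale that table up to a few thousand patterns backed by a large common-word dictionary and you have, roughly, the sort of thing we were running on that dedicated IBM PC.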

Alas, we soon reached the limits of our computing equipment, and that (unofficial) project ground almost to a halt for lack of memory and speed in the PCs of the day.  The C components of the system ran quite fast.  The Prolog components, however, often ambled along at a less than desirable speed.

Today, things have changed greatly.  Our personal computers are blazingly fast compared to those old PCs.  So is most of the software.  Better yet, some of today’s operating systems are (almost) blazingly fast at searching and retrieving information from the Internet.

Here, I'm talking about Android.

Most people would not think of Android as a speed demon, but my new HTC EVO seems to do some things faster than my desktop -- especially when running Google applications.  Not only can it find and display information quickly, it also has good speech-to-text and text-to-speech functions.  Even better, the information necessary to run those routines is not part of my local storage, which is why they can run on my phone at all.  How cool is that!
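For the curious, those speech functions are exposed to ordinary applications through the Android SDK: recognition goes through a RecognizerIntent (the actual transcription happens on Google's servers, which is why the data doesn't have to live on the phone), and spoken output goes through the TextToSpeech class.  A rough sketch of the plumbing, with the error handling trimmed away, looks something like this:

```java
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognizerIntent;
import android.speech.tts.TextToSpeech;

import java.util.ArrayList;
import java.util.Locale;

// Rough sketch: listen with the platform speech recognizer, then read the
// top transcription back aloud with the text-to-speech engine.
public class TalkbackActivity extends Activity implements TextToSpeech.OnInitListener {
    private static final int REQ_SPEECH = 1;
    private TextToSpeech tts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tts = new TextToSpeech(this, this);   // initialize the TTS engine

        // Fire the built-in speech recognizer; results arrive in onActivityResult.
        Intent listen = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        listen.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        startActivityForResult(listen, REQ_SPEECH);
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.US);
        }
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQ_SPEECH && resultCode == RESULT_OK) {
            ArrayList<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            if (results != null && !results.isEmpty()) {
                // Echo the recognized phrase back through the speaker.
                tts.speak("You said " + results.get(0), TextToSpeech.QUEUE_FLUSH, null);
            }
        }
    }

    @Override
    protected void onDestroy() {
        if (tts != null) tts.shutdown();
        super.onDestroy();
    }
}
```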

Yeah, Android, I'm looking at you and wishing that I wasn't retired and still had that laboratory (and the funding) to play in!

Picture a natural-language dictionary of 10,000 common words with all the parsing routines already written.  Any of today's mid-priced computers could handle that easily.  Add in our updated Eliza-Doctor software, along with the Google Android applications and a few other information-gathering apps.
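To make that concrete, here is one way the pieces might be glued together, assuming the toy TinyEliza responder sketched earlier; the six-word question list stands in for the 10,000-word dictionary, and the class and method names here are invented for illustration:

```java
import android.app.Activity;
import android.app.SearchManager;
import android.content.Intent;

import java.util.*;

// Illustrative glue: route an utterance either to an information lookup
// (hand the query to Android's web search) or to the Eliza-style chat code.
public class RobotBrain {
    private final Set<String> questionWords =
            new HashSet<>(Arrays.asList("who", "what", "when", "where", "why", "how"));

    // Very crude "parse": does the utterance look like a factual question?
    private boolean looksLikeQuestion(String utterance) {
        String first = utterance.trim().toLowerCase().split("\\s+")[0];
        return questionWords.contains(first);
    }

    // Returns a spoken reply, or launches a search and returns a stock phrase.
    public String handle(Activity host, String utterance) {
        if (looksLikeQuestion(utterance)) {
            // Let the phone's information-gathering apps do the heavy lifting.
            Intent search = new Intent(Intent.ACTION_WEB_SEARCH);
            search.putExtra(SearchManager.QUERY, utterance);
            host.startActivity(search);
            return "Let me look that up for you.";
        }
        // Otherwise fall back to small talk.
        return TinyEliza.respond(utterance);
    }
}
```

Bolt the speech plumbing from the previous sketch onto either end of that handle() routine and you have the skeleton of the walking information service I'm describing.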

Hello HAL Jr.!

Could that be programmed into a robot?  Sure.  A limited version of it is already in my EVO phone, so I'm sure there would be no problem fitting the proper hardware and software into a robot the size of Nao.

And there you have it: an information service that walks around and reacts to the emotions of its user.  Good, bad or indifferent, that project is doable right now.

Sure, I know, there's little monetary benefit to constructing such a device.  But that's what some said about the PC, too, back when I was a researcher.  There's no real profit margin in it at the moment.  But isn't that the direction robotics is heading?  I think so.  So it stands to reason that the first company to get a package like that working will most probably be the one that shows a profit from "personal robots" soonest.  And, from my point of view, I hope it's an American company. . . .

Yeah, I'm looking at you, Android.  I expect to see Android become a great little core system for the "thinking" and communication parts of a personal robot.  All the communication and Internet-handling routines are already worked out and reasonably fast, so it's almost a plug-and-play system.

So okay, some of the good computer scientists may find a few things to contradict here.  But, isn't the concept sound?

 

# # #

2 comments:

Anonymous said...

I don't think the concept is sound.
Reason: the propensity of mankind to weaponize anything and everything in order to wage ever more devastating wars, not with the goal of winning but with the goal of exacerbating and expanding wars for war's sake.

When 'creators' can build AI/robotics that will not interface with weapons no matter what, they will have a sound concept. Till then, no.

ManicScribbler said...

Hi Doug,
I found your article (and the comment from Anonymous) particularly interesting, as I am in the process of writing a collection of short SciFi stories on this very subject with my son (an intelligent and imaginative computer scientist).
Hope to read more.
Regards,
Lyn

About Me

Retired medical research scientist and clinical engineer and sometimes political campaign volunteer. Presently writing political commentary -- and starting to dabble in fiction. Interests include politics, alternative medicine, photography, and communications.