
Thoughts on whatever timely topic comes to mind.

August 09, 2010

Use Android + AI for “Personal Robots”?

By Doug Fiedor
August 9, 2010


Nao, a robot designed by researchers at the University of Hertfordshire, is said to be "able to display and detect emotions", The Telegraph in London reports.  Computer scientist Lola Cañamero said, "Nao is able to detect human emotions through a series of non-verbal 'clues', such as body-language and facial expressions, and becomes more adept at reading a person's mood through prolonged interaction."  Watching the video at the site, one sees that the 23-inch robot is rather cute -- and it’s almost a shame that they describe how it operates.


Anyway, they used a number of modern sensors feeding an artificial-intelligence routine to detect the attitude and emotion of the human interacting with the robot, and they taught the robot appropriate responses.

Back when the Internet was nothing but a terminal hook-up between universities and researchers, we played with a software package something like that, called Eliza and, later, Doctor.  In fact, in my physiology research laboratory, we dedicated an IBM PC specifically to an offshoot of Eliza programmed in a combination of C and Prolog.  I wrote the code and designed a parsing system containing 5,000 common words for the natural language section.  However, we kept an "open" research notebook for the project and welcomed suggestions from all of the researchers on the floor.

The unsuspecting user could actually have a conversation with that computer.  Some thought we were tricking them and that there was a real person in another room answering.  It would often take a quick-thinking human a few minutes to trick the computer into an incorrect answer.
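For a flavor of how a program like that holds up its end of a conversation, here is a minimal Eliza-style sketch in Python.  The patterns and wording below are purely illustrative -- they are not the original lab code, which was far larger and written in C and Prolog:

```python
import re

# Eliza's trick: match the user's sentence against a list of
# patterns and echo captured fragments back as open-ended questions.
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (\w+)", re.I), "Tell me more about your {0}."),
]

def respond(sentence):
    """Return the first rule's response that matches, else a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(*match.groups())
    return "Please go on."

print(respond("I feel tired today"))  # Why do you feel tired today?
print(respond("Nice weather lately"))  # Please go on.
```

A handful of rules like these, plus a large enough word list for the parser, was enough to keep an unsuspecting user talking for a surprisingly long time.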

Alas, we soon reached the limits of our computing equipment, and that (non-official) project ground almost to a halt for lack of memory and speed in the available PCs of the day.  The C components of the system ran quite fast.  The Prolog components, however, often ambled along at a less than desirable speed.

Today, things have changed greatly.  Our personal computers are blazingly fast compared to those old PCs.  So is most of the software.  Better yet, some of today’s operating systems are (almost) blazingly fast at searching and retrieving information from the Internet.

Here, I'm talking about Android.

Most people would not think of Android as being a speed demon, but my new HTC EVO seems to do some things faster than my desktop -- especially when running Google applications.  Not only can it find and display information quickly, it also has good speech-to-text and text-to-speech functions.  Even better, the information necessary to run those routines is not part of my local storage, so they can still run on my phone.  How cool is that!

Yeah, Android, I'm looking at you and wishing that I wasn't retired and still had that laboratory (and the funding) to play in!

Picture a natural language dictionary of 10,000 common words with all the parsing routines written.  Any of today's mid-price computers could handle that easily.  Add in our updated Eliza-Doctor software, along with all of the Google Android applications and a few other information-gathering apps.
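The dictionary-plus-parser piece is the simplest part of that picture.  As a toy illustration (a stand-in for the 10,000-word dictionary, with made-up entries and part-of-speech tags), each word of an utterance can be tagged by a straight lexicon lookup:

```python
# A tiny lexicon mapping words to part-of-speech tags.  A real system
# would carry thousands of entries plus grammar rules on top of this.
LEXICON = {
    "the": "DET", "a": "DET",
    "robot": "NOUN", "user": "NOUN", "phone": "NOUN",
    "sees": "VERB", "answers": "VERB",
    "happy": "ADJ", "sad": "ADJ",
}

def tag(sentence):
    """Tag each word via lexicon lookup; unknown words get 'UNK'."""
    return [(word, LEXICON.get(word.lower(), "UNK"))
            for word in sentence.split()]

print(tag("The robot sees a happy user"))
```

Lookups like this are trivial for any modern machine, even a phone; the hard part was, and remains, what the software does with the tags afterward.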

Hello HAL Jr.!

Could that be programmed into a robot?  Sure.  Limited versions of it are already in my EVO phone, so I'm sure there would be no problem inserting the proper hardware and software into a robot the size of Nao.

And there you have it: An information service that walks around and reacts to the emotion of the user.  Good, bad or indifferent, that project is doable right now.

Sure, I know, there’s little monetary benefit to constructing such a device.  But, that's what some said about the PC, too, back when I was a researcher.  There's no real profit margin in it, at the moment.  But, isn't that the direction robotics is going?  I think so.  So, it stands to reason that the first company that gets a package like that going is most probably the company that will show a profit from "personal robots" soonest.  And, from my point of view, I hope it's an American company. . . .

Yeah, I'm looking at you, Android.  I expect to see Android becoming a great little core system for the "thinking" and communication part of a personal robot.  All the communication and Internet handling routines are already perfected and reasonably fast, so it’s almost a plug-and-play system. 

So okay, some of the good computer scientists may find a few things to contradict here.  But, isn't the concept sound?

# # #

August 02, 2010

Restaurant Violates ADA Law?

Did you know that, if you play your cards right, you can get free money for being a bit inconvenienced because of a handicap?  Being what some call “disabled,” I knew a little about that, of course.  But I had forgotten some of it. 

Ben Conery, writing for The Washington Times, reminded me about that in his article “Chipotle in violation of disabilities act” this morning. 

Apparently there’s a paraplegic college professor out near San Diego who “has an extensive history of filing ADA-related lawsuits” and didn’t like how high the counters are at Chipotle restaurants because it’s a problem for him “choosing from among the ingredients lining the counter and watching staff assemble the meal” while sitting in a wheelchair.  Therefore, he wants the restaurant chain to remodel their business to conform to his special needs.  So, as reported:

“The 9th U.S. Circuit Court of Appeals in San Francisco ruled that two restaurants in San Diego violated the Americans with Disabilities Act (ADA) because the counters where the staff prepared tacos and burritos were too high and blocked the view for people in wheelchairs.”

Well now, let’s see how that works for me.  My neck and spine are frozen.  Yeah, all of it.  I can’t look up and I can’t look down.  Oh, and I cannot see all that well, either.

As many might know, most fast food joints have their menus high up, behind the counters.  See what I’m getting at yet?  I cannot see the menus that are placed high behind the counters.  Sometimes I can see the pictures, but in no case can I read the words – or the prices.  So, according to this whining professor, and the ever silly, often overturned, 9th U.S. Circuit Court of Appeals, I have been wronged!  Legally.

Yuppers, under the ADA I can sue any and all of these restaurants and not only get them to change the way they do business but also make them pay me money.  Cool, eh? 

Now, I can’t really say exactly where I would like them to place their menus so it would be comfortable for me.   But that doesn’t really matter.  I don’t have to redesign their facility to make me comfortable.  All I have to do is whine that the present design is unworkable for me.  And no, under the silly ADA laws, it also doesn’t matter that the current design is just fine for 98% of the population, either.  I can “get paid” anyway, simply because it’s difficult for me to choose an order in those places. 

Of course, I would never do such a thing.  But I’ve just been reminded that the current federal law is such that I could.  Better yet, the federal government would even provide me with a free lawyer, if I so desired.


About Me

Retired medical research scientist and clinical engineer and sometimes political campaign volunteer. Presently writing political commentary -- and starting to dabble in fiction. Interests include politics, alternative medicine, photography, and communications.