CB2, the Robo-Boy prototype of the next generation


A BALD, child-like creature dangles its legs from a chair as its shoulders rise and fall with rhythmic breathing and its black eyes follow movements across the room. It's not human - but it is paying attention.

Beneath the soft silicone skin of one of Japan's most sophisticated robots, processors record and evaluate information. The 130cm humanoid is designed to learn just like a human infant.

The creators of the Child-robot with Biomimetic Body, or CB2, say it is slowly developing social skills by interacting with humans and watching their facial expressions, mimicking a mother-baby relationship.

"Babies and infants have very, very limited programs. But they have room to learn more," said Osaka University professor Minoru Asada, as his team's 33kg invention kept its eyes glued to him.

The team is trying to teach the pint-sized android to think like a baby who evaluates its mother's countless facial expressions and "clusters" them into basic categories, such as happiness and sadness.

With 197 film-like pressure sensors under its light grey rubbery skin, CB2 can also recognise human touch, such as stroking of its head.
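As a purely illustrative sketch - not CB2's actual software - the idea of recognising a touch such as stroking from an array of pressure sensors can be toy-modelled like this: a stroke shows the peak pressure travelling across adjacent sensors over time, while a brief poke stays in one place. The sensor layout, thresholds and readings below are all invented for the example.

```python
import numpy as np

def classify_touch(readings, pressure_threshold=0.2):
    """Toy heuristic: a stroke activates different adjacent sensors
    over several time steps; a poke activates one spot briefly.
    readings: rows = time steps, columns = adjacent sensors."""
    active = readings > pressure_threshold
    frames = active.any(axis=1)          # time steps with any touch at all
    duration = int(frames.sum())
    # which sensor carries the peak pressure at each active time step
    peaks = readings[frames].argmax(axis=1)
    moved = int(peaks.max() - peaks.min()) if duration else 0
    if duration >= 3 and moved >= 2:
        return "stroke"
    return "poke"

# A touch sweeping across four sensors, then a single brief press.
stroke = np.array([[0.5, 0.0, 0.0, 0.0],
                   [0.0, 0.5, 0.0, 0.0],
                   [0.0, 0.0, 0.5, 0.0],
                   [0.0, 0.0, 0.0, 0.5]])
poke = np.array([[0.0, 0.6, 0.0, 0.0],
                 [0.0, 0.0, 0.0, 0.0]])
print(classify_touch(stroke))  # stroke
print(classify_touch(poke))    # poke
```

The real robot's 197 sensors and signal processing are of course far richer; this only shows the shape of the problem.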

The robot can record emotional expressions using eye-cameras, then memorise and match them with physical sensations, and cluster them on its circuit boards, Prof Asada said.
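The "clustering" Prof Asada describes can be sketched with a plain k-means pass over made-up expression features; the feature vectors and numbers below are invented for illustration and have nothing to do with CB2's actual internals.

```python
import numpy as np

# Hypothetical feature vectors for observed facial expressions,
# e.g. [mouth_curvature, eye_openness] extracted from the eye-cameras.
rng = np.random.default_rng(0)
smiles = rng.normal(loc=[0.8, 0.6], scale=0.1, size=(20, 2))
frowns = rng.normal(loc=[-0.7, 0.3], scale=0.1, size=(20, 2))
expressions = np.vstack([smiles, frowns])

def kmeans(points, k, iters=50):
    """Plain k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its points."""
    centroids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # distance of every point to every centroid
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # keep a centroid in place if its cluster went empty
        centroids = np.array([points[labels == j].mean(axis=0)
                              if np.any(labels == j) else centroids[j]
                              for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(expressions, k=2)
# The two clusters separate the smile-like from the frown-like expressions.
```

The unsupervised grouping is the point: nobody tells the algorithm which category is "happiness", it only discovers that the expressions fall into distinct groups.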

Since CB2 was first presented to the world in 2007, it has taught itself how to walk with the aid of a human and can now move its body through a room quite smoothly, using 51 "muscles" driven by air pressure, he said.

In coming decades, Prof Asada expects science will come up with a "robo species" that has learning abilities somewhere between those of a human and other primate species such as the chimpanzee.

And he hopes that his little CB2 may lead the way, with the goal of having the robo-kid speaking in basic sentences within about two years, matching the intelligence of a two-year-old child.

By 2050, Prof Asada wants a robotic team of football players to be able to take on the human World Cup champions - and win. Welcome to the cutting edge of robotics and artificial intelligence.

More than a decade after automaker Honda stunned the world with its walking humanoid P2, a forerunner to the popular ASIMO, robotics has come a long way.

Researchers across Japan have unveiled increasingly sophisticated robots with different functions - including a talking office receptionist, a security guard and even a primary school teacher.

Electronics giant Toshiba is developing a new model of domestic helper, ApriAttenda, which moves on wheels and can fetch containers from a refrigerator with its two arms.

Last month also saw the debut of Japan's first robotic fashion model, cybernetic human HRP-4C, which can strut a catwalk, smile and pout thanks to 42 motion motors programmed to mimic flesh-and-blood models.

A Tokyo subsidiary of Hello Kitty maker Sanrio, Kokoro - which means heart or mind in Japanese - has also produced advanced talking, life-size humanoids. "Robots have hearts," said Kokoro planning department manager Yuko Yokota.

"They don't look human unless we put souls in them. When manufacturing a robot, there comes a moment when light flickers in its eyes. That's when we know our work is done."




Comment by Mikhayl Von Riebon on April 10, 2009 at 12:23am
Wow, that is a really interesting question, because there is definitely more to suffering than pain. We have emotional pain, fear, depression, the sense of isolation and the feeling that our identity, rights and views are denied, restricted or neglected, and there is also the need for freedom.

We could start to address all of these. For example:

What is physical pain? It's something quite subjective - I can only tell someone else is in pain by comparing their physical responses to my own. I know what pain is and I am sympathetic to others, so I 'assume' they feel pain and move to help them eliminate it. If we designed a robot without this sense of compassion we would be creating a sociopath - very much like the serial offenders who exist today, where that part of their reasoning ability is non-functional, or even warped to the point that they feel a positive response when people suffer. Pain is important to us because it tells us when our body is not functioning properly, or when something is stopping it from doing so. All entities need this signal in order to better sustain their existence. One could even say there may have been a species that didn't have this sense of self-preservation, and at some point its birth rate dropped below its death rate and it became extinct. The dodo, perhaps?

As for the latter forms of pain, I would imagine that these are closely related to the subject of 'I' and the ego. Part of the ego's nature is to establish an identity, irrespective of what it may be, so long as it functions within the needs of the body. This identity is created through the relationships it perceives between things, and eventually between the idea of self and other. All the actions it takes, all likes and dislikes, all communication is designed from an 'I' perspective, and to reinforce the 'I' perspective. This 'I' software has proven highly beneficial to the survival of the body, and so continues. The 'I' entity further grows through social entities, where it gains prestige and power over other 'I' entities. This has been found useful, as higher prestige and power mean higher survival ability.

Sometimes an 'I' entity becomes unhealthy, and either takes more than the social entity allows and is retaliated against, or not enough, and becomes depressed and dies. So there is a threshold this 'I' entity lives within. Some social entities have higher or lower thresholds than others, and so different cultures arise with differing levels of introversion and extroversion.

We can then imagine that emotional pain might arise from the feeling of loss of a thing the 'I' entity has used to identify itself with - like a loved one, or a passionately hated one, or even an object the 'I' has used to identify. It would be natural to feel pain in this way, because the construct of the 'I' depends on all the relations around it, and if part of those relations changes dramatically, it changes dramatically. I think we only feel pain, though, in proportion to the amount of time and energy put into a relationship. If you just pass someone on the street you have a relation, but there is a subconscious expectation that they will go, and that is part of the relation with that person.

The fascinating thing is that although the 'I' has no real proof that the other entities have 'I's as well (other than a Turing test), it believes they do, much in the way it believes they feel pain - through similar reactions to things. Thus it imbues objects with identities: sometimes out of necessity (otherwise it ends up like a sociopath and is dealt with through the justice system), sometimes out of companionship and self-identity (like 'Wilson', the ball in Tom Hanks' Cast Away).

So to sum up, I suppose these pains are all subject to the construct of the 'I', as if it were a physical building, and to this 'I''s relation to the world around it. AI, then, is simply recreating this construct on an artificial platform. No wonder the Buddha said all existence is suffering. If man wants to make a machine in his own image, he must give it the ability to feel pain.
Comment by Susan Thomas on April 9, 2009 at 10:05pm
We are here to put the mind and spirit into science. AI could do the physical work I do, but could not interact with either other AIs or with humans the way I do. Even the most severely demented and compromised people I work with know when I have smiled at them, even when they have lost the ability to smile back (one of the signs of end-stage dementia/Alzheimer's type). If, as the Buddha's followers have taught, all existence is suffering, it would be the depths of hell for me to have a robot's smile be the last smile I see on this blue marble.
Sentient beings are more than mind. If AI were ever considered sentient, then by Buddhist logic, they would also suffer, no? What would be the response of AI to suffering?
Comment by Mikhayl Von Riebon on April 7, 2009 at 5:23am
About The Art of War - very fascinating there, Joe, although I'd ask what our current will is, other than to expand as a cultural entity. I'm not really sure what Western society's goal is at this point other than to expand, and from what I see of globalisation, any side's expansion is pointless as all economies are interdependent. I definitely agree with the future technological prospects; I'll be damned, though, if it becomes consumer-based, though that's probably what will fuel it. Imagine life extensions available only to the rich? That's why I could imagine money will become redundant. Then definitely Borg-like. But I suppose, what's wrong with that? We are all one anyway, right? It would just be the ego, which we know is illusory anyway, that would resist it :D... actually this is really scary. Resistance is futile!
Comment by Mikhayl Von Riebon on April 7, 2009 at 4:44am
I agree. In fact, I wonder what the majority of us would do - what purpose would we have in our lives without hard work? It provides us with a sense of self-worth and accomplishment. I could imagine that if we could transfer and expand our minds into silicon and experience virtual reality so real it was like reality, all that we would do is explore the infinite possibilities and variations of life. I mean, who now wouldn't want to be a Luke Skywalker or a Captain Picard, and live their lives as if on holiday? Without work, and with physical immortality, we would probably turn our attention to the infinite lives that could be experiencing their struggles and hard work. Of course, we wouldn't be allowed to remember who we were while we were living these lives, as otherwise we would not truly experience the struggle. This would seem all too natural as well, as we can already imagine that we are indeed doing this - instead, as the eternal, experiencing the infinite possible manifestations.
Comment by Mikhayl Von Riebon on April 7, 2009 at 4:29am
The peculiar thing is that they would eventually only be killing other robots. Military pinpoint strategy has already shaped war to the point that nine casualties is a bad war (thinking of the recent Israel attack), compared to millions in WW2. In modern combat there's no need to kill the enemy when you can just take his weapons away, strip his economy and watch civil unrest develop.
Comment by Mikhayl Von Riebon on April 6, 2009 at 11:53pm
What do you think would happen to the workforce if we were able to produce cheap AI labour? How would this affect your life?
