
dc.date.accessioned: 2013-03-12T08:14:34Z
dc.date.available: 2013-03-12T08:14:34Z
dc.date.issued: 2006
dc.date.submitted: 2006-03-23
dc.identifier.citation: Paus, Aleksander. Adaptive facial behaviour using selected methods in machine learning and motor control. Master's thesis, University of Oslo, 2006
dc.identifier.uri: http://hdl.handle.net/10852/9418
dc.description.abstract: In the not so distant future, androids may be part of our everyday lives. We may want these androids not only to do our work, but also to communicate with humans in a natural way. Since a substantial part of human communication happens through body language, natural mimicry is essential if artificial communication is to seem real. Even though this might appear to be a trivial problem, there are many obstacles to overcome. To achieve a physically human appearance, we have to develop artificial skin that looks and folds naturally. This is far from an easy task, as living tissue has totally different characteristics from synthetic materials. Secondly, we need some form of artificial actuators, preferably situated in or behind the synthetic skin. Human muscles have some amazing properties that we are not yet able to match: they are silent, strong, flexible, and precise, and last for millions of cycles. Finally, there is a need for a sensory system of some sort. Humans have an extremely advanced feedback system, providing information about factors such as pressure, temperature, and pain. This feedback enables humans to make advanced decisions about their surroundings and to adjust their behaviour accordingly. Even if all these factors were in place, natural mimicry would not be achieved until the android developed adaptive facial expressions. A robot could of course be pre-programmed with a fixed set of facial expressions, but this would undoubtedly restrict the personification of such an android. To achieve natural mimicry and the impression of personality, it is essential to make the android's facial expressions adaptive, and to let the android learn and create facial expressions it has never shown before. This master's thesis addresses parts of the problem of achieving adaptive facial behaviour, which is essential for making androids operate in social settings and communicate naturally with humans.
dc.language.iso: eng
dc.title: Adaptive facial behaviour using selected methods in machine learning and motor control
dc.type: Master thesis
dc.date.updated: 2006-04-24
dc.creator.author: Paus, Aleksander
dc.subject.nsi: VDP::420
dc.identifier.bibliographiccitation: info:ofi/fmt:kev:mtx:ctx&ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft.au=Paus, Aleksander&rft.title=Adaptive facial behaviour using selected methods in machine learning and motor control&rft.inst=University of Oslo&rft.date=2006&rft.degree=Masteroppgave
dc.identifier.urn: URN:NBN:no-12146
dc.type.document: Master's thesis (Masteroppgave)
dc.identifier.duo: 37522
dc.contributor.supervisor: Mats Høvin
dc.identifier.bibsys: 060686782
dc.identifier.fulltext: Fulltext https://www.duo.uio.no/bitstream/handle/10852/9418/1/Masterthesis.pdf

