Post by DeNitro on Oct 12, 2016 9:35:18 GMT
What I'd like to point out is that AIs will likely NOT think the same way humans do. As DeNitro pointed out, our brains evolved emotion much earlier than reason, so our reasoning always passes through emotion first. An AI designed from scratch won't have to have such a mind pattern; in fact, that would be a rather inefficient way to construct an AI. So unless its creators were willing to sacrifice its reasoning power to make it more human-like, we probably won't be seeing AIs with true feelings and emotions.
However, in the DA setting as I understand it, Minervan science is as yet unable to build an intelligence from scratch. So Angeline's team more or less copied a large part of the human brain's architecture, which necessitated copying the emotional parts too, since they are closely wired into reasoning. This is a viable approach called biomimicry. But as the science advances, we'd expect to see a shift from human-like AIs to more purely logical intelligences without emotions, or at least ones whose reasoning isn't influenced by emotions, with emotion being processed only after the actual reasoning is done.
Sentience is a biological standard expected and projected by humans onto some ultimate AI. Consider what sentience is by our own definition: an ability to understand, experience, and display consciousness, personality, empathy, desire, will, ethics, humor, ambition, insight, and other human emotions. We assume anything with this ability is dangerous, because we know we are.