Todak, I think @dimreepr means that when an AI shows desires and motivations, as living creatures do, we would be more likely to infer some type of conscious agency.
I myself am unsure. A self-preservation algorithm can be fully programmed and implemented without any actual awareness or desire to keep living. The steam-heat system in my aunt's house works to maintain a certain equilibrium so that it doesn't break down, but few would call it conscious. I am more likely to infer consciousness when an entity, in order to survive, reveals an ability to improvise novel methods of protection. Perhaps that makes me an intelligence chauvinist.
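To make that point concrete, here's a minimal sketch (in Python, with invented names and thresholds) of the kind of "self-preservation" the steam-heat system performs. Every response is fixed in advance; the controller maintains its equilibrium without awareness of, or improvisation about, anything:

```python
def thermostat_step(current_temp, setpoint, deadband=1.0, heater_on=False):
    """Classic bang-bang control: heat when below the deadband,
    stop when above it, otherwise keep the current state."""
    if current_temp < setpoint - deadband:
        return True   # too cold: turn heater on
    if current_temp > setpoint + deadband:
        return False  # too warm: turn heater off
    return heater_on  # within the deadband: no change

# One simulated hour, one step per minute. The system "protects" its
# equilibrium purely mechanically -- nothing here desires to survive.
temp, heater = 15.0, False
for _ in range(60):
    heater = thermostat_step(temp, setpoint=20.0, heater_on=heater)
    temp += 0.3 if heater else -0.1
```

The loop settles into oscillation around the setpoint, which is exactly the behavior we'd describe, in a living thing, as trying to stay alive. The difference my intuition tracks is that this routine can never do anything outside its two fixed responses.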
On some theories, like Tononi's, it's all a gradual continuum of awareness, with even single ants and flatworms having a dim awareness and qualia. Epistemically, I believe we are forever blocked from knowing what it's like to be an ant. Consciousness is a process that is only known, experientially, from the inside. That's just how it is; to say so is almost tautological.