OP sounds a bit like an AI, but it's a cool speculative post. To Mordred: the challenges of defining consciousness do seem insurmountable, but once you require that something be conscious *of* something else, that relation has a unique signature in terms of correlations (optic nerve signals being correlated with external lights, and so on). It's no different with AIs. This is just a baseline: you can't say these correlations create consciousness, but you can say that without them there is none. Either way, such correlative relationships may turn out to be interesting for studying physical systems. The smart money is against anthropocentric notions, including the specialness of our minds.