Could it be both? Like this: we perceive something, perhaps slight changes in someone's facial expression, perhaps the way someone fidgets, whatever, in any case it's very subtle, blink and you'll miss it. This gives us information on the other person's state of mind. But we can only decode that information within the framework of our own emotions. And that's where the errors come in.

Couldn't it be a bit like the way we perceive information through our senses? That is heavily influenced by the categories we've formed in our minds (regardless of type). We never perceive everything. What we generally do is this: we perceive certain key stimuli, and these trigger the categorization of the entire object. So if you see a dog, for example, your brain doesn't process the entire wealth of information your senses send it. It has a holistic, "archetypal" image of a dog, and if there are enough bits of information to trigger that image, then that's it, task done, next one please. If you've ever wondered why toddlers spend HOURS staring at a doggie (at everything, really), that's why. They haven't yet formed very well-defined categories, and that's why they need to examine everything over and over and over again. And of course that cookie-cutter way of perceiving reality explains why we so often get it wrong... for example, why we can mistake a coat and a pair of boots for a man standing in a dark corner.

Ach, this has nothing whatever to do with the subject of this thread. I was trying to see parallels to the way we "perceive emotions"... but perhaps there are none. Anyway, it's something I feel strongly about, so that should satisfy those of us who've got strong