Why do women not get taken seriously?
This is something I notice a lot in day-to-day life as well, but lately I experience it all day, every day at work. I (F) work with men; it's not what I usually do, but I help them out since they're understaffed. Since I work in the background, I know way more about how things work than they all do combined. However, when we work with customers, no matter the customer's gender, they almost always demand to talk to one of the men. There's an issue, I explain it to the customer, they yell at me, don't believe a word I say, and ask for one of the men to handle it, just for him to tell the customer the exact same thing. They accept what the men say but not what I say? Why is that? I give them a way more informed insight and explanation and actually deal with the issue, but they always need a man to confirm? Have you encountered things like that? How do you feel about this? Do people just assume women can't do the job as well?
*Edit, since it has been brought up: I'm not blaming sexism. I'm genuinely curious whether this is something people ever think about when they're the customer themselves, and whether it's a conscious choice.