Have you ever seen sex as purely physical, only to realize it’s something deeper?

I'm not asking about casual or meaningless sex here. For those of you who don't see it that way—have you ever assumed sex was just a physical act, only to realize it meant something much deeper? Maybe it felt spiritual, or it taught you something profound about connection or acceptance.

What was that experience like for you, and how did it change your perspective?