I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it seen as expected? Do you live in a city?

I have healthy teeth that have nevertheless seen a lot of tea and coffee. I’ve generally thought of the staining as similar to wrinkles, i.e. a natural thing bodies do that I don’t want to pay money to fix, since nothing is actually broken. I still think this. But lately I’ve been feeling like there might be real social stigma attached to discolored teeth, and I’m wondering whether that’s actually the case. Has whitening become something all adults are now expected to do? I thought I’d ask how other people think about this and what the general norm is in your social circle.

Edit: Thanks for the responses, everybody.

  • Tedrow

    I think comparing whitening to bathing and using deodorant amounts to calling it normal hygiene, and that comparison doesn’t hold up: not bathing literally leads to worse health outcomes, while not whitening doesn’t.

    That being said, you’re correct, I definitely have a strong bias here. My dentist told me not to do it because it damages the enamel. Consulting your own dentist is a good move.