I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it seen as expected to do so? Do you live in a city?

I have healthy teeth that have nevertheless seen a lot of tea and coffee. I have generally thought of this as similar to wrinkles, i.e. a natural thing bodies do that I don’t want to pay money to fix, since nothing is broken. I still think this. But lately I have been feeling like there might be more actual social stigma around discolored teeth. I am wondering whether this is real. Has whitening become an expected thing for all adults to do now? I thought I’d ask how other people feel and think about this, and what the general norm is in your social circle.

Edit: thanks for the responses everybody.

  • @Cuttlefish1111
    7
    1 month ago

    I had my teeth whitened professionally three times, and it was only on the third visit that I was told some people have naturally yellow teeth and there’s nothing that can be done.

    Most people don’t care.

    • @[email protected]
      1
      1 month ago

      I was wondering about that, since my teeth are kinda yellow.

      So it doesn’t help to have them whitened?

      • @Cuttlefish1111
        1
        1 month ago

        It’s best practice to just keep brushing regularly and go to the dentist for a cleaning once a year.

        For some people, whitening is useless.