I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it seen as expected to do so? Do you live in a city?

I have healthy teeth that have nevertheless seen a lot of tea and coffee. I have generally thought of this as similar to wrinkles, i.e. a natural thing bodies do that I don’t want to pay money to fix since it isn’t broken. I still think this. But I have been feeling lately like there might be more actual social stigma to my teeth being discolored. I am wondering if this is at all real? Has whitening teeth become an expected thing for all adults to do now? I thought I’d ask how other people feel and think about this and what the general norm is in your social circle.

Edit: thanks for the responses everybody.

  • @[email protected]
    4 months ago

    I feel like this position devalues a lot of folks’ actions navigating dysphoria. It just so happens you find teeth whitening shallow, but ultimately it’s someone trying to align their outward appearance with their mental self-image.

    It’s not for us to judge the choices others make with their bodies; we can just hope they are healthy and happy.

    • @[email protected]
      4 months ago

      I don’t think it’s shallow. I think it may reflect a psychological process that reveals insecurity.
      I do judge people on it initially, as I do according to all my prejudices. When I then get to know someone better, I adjust my views accordingly.