• @[email protected]
    link
    fedilink
    4
    edit-2
    5 months ago

    That’s such an ignorant statement.

    It was based on human perception of temperature.

    0 being the coldest day measured and 100 the hottest (as tested by other means).

    It’s a scale based on human perception, and it still works with whole numbers.

    A fever of 100 vs 101, as opposed to a fever of 37.78 vs 38.3. (No, these are not fever thresholds; I’m using whole numbers as an example. Yes, Fahrenheit also uses decimals. My point is the granularity of the scale.)
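
    For anyone who wants to double-check those numbers, here's a minimal sketch using the standard °F→°C conversion (the function name is just illustrative):

    ```python
    def fahrenheit_to_celsius(deg_f: float) -> float:
        """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
        return (deg_f - 32) * 5 / 9

    # The fever comparison above, converted:
    print(round(fahrenheit_to_celsius(100), 2))  # 37.78
    print(round(fahrenheit_to_celsius(101), 2))  # 38.33
    ```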

    Metric & SI units may be better, but you’re still wrong.

    • @SmoothOperator
      According to Wikipedia, Fahrenheit is not based on that, but on the freezing point of brine (0 degrees F) and an approximation of average human body temperature (originally about 96 degrees F).

    • @[email protected]
      link
      fedilink
      English
      5
      edit-2
      5 months ago

      That’s such a stupid statement.

      1. You can still use decimals with Fahrenheit.
      2. Fever is not measured to two-decimal accuracy. Generally, 38 = mild fever, 39 = high fever, 40 = fucking high fever. Thermometers still display 0.1 °C accuracy because why the fuck not.
      3. Edit: according to Wikipedia, normal human body temperature is 97.7–99.5 °F, or 36.5–37.5 °C (quick sanity check below).
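
      To verify that range, here's a minimal sketch going the other way with the standard °C→°F formula (again, the function name is just illustrative):

      ```python
      def celsius_to_fahrenheit(deg_c: float) -> float:
          """Convert a temperature from degrees Celsius to degrees Fahrenheit."""
          return deg_c * 9 / 5 + 32

      # The body-temperature range quoted in point 3:
      print(round(celsius_to_fahrenheit(36.5), 1))  # 97.7
      print(round(celsius_to_fahrenheit(37.5), 1))  # 99.5
      ```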

      Also, as someone mentioned below, it has nothing to do with the hottest/coldest day recorded (though that would be even worse).

      Several accounts of how Fahrenheit originally defined his scale exist, but the original paper suggests the lower defining point, 0 °F, was established as the freezing temperature of a solution of brine made from a mixture of water, ice, and ammonium chloride (a salt). The other limit established was his best estimate of the average human body temperature, originally set at 90 °F, then 96 °F (about 2.6 °F less than the modern value due to a later redefinition of the scale).