Discussion:
Is there a scientific definition of Fahrenheit, or not?
DWIII
2006-07-07 15:49:49 UTC
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.

Both Fahrenheit and Celsius, after some historical numerological
juggling, eventually settled their respective scales on two fixed
points: the melting point (0 degrees C; 32 degrees F) and the boiling
point (100 degrees C; 212 degrees F) of water. Exactly under what
environmental conditions, with what purity of water sample, etc., these
points were practically realized is a fine detail, but not salient to
my point here.

Later, with the adoption of the Kelvin scale by SI, whose two fixed
points are taken to be absolute zero and the triple point of water,
separated by exactly 273.16 divisions, the Celsius scale was redefined
(for scientific purposes) in terms of kelvins to wit: 0.01 degrees
Celsius is now identified with the triple point at 273.16 K, and the
size of the degree Celsius is defined to be exactly equal to the Kelvin
(http://physics.nist.gov/cuu/Units/kelvin.html). Note that this
redefinition does away with the former Celsius fixed points (0, 100).
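
For concreteness, the relation this definition implies can be written out
as a tiny Python sketch (the function names are mine and purely
illustrative, not anything official):

def kelvin_to_celsius(t_k):
    # t/degC = T/K - 273.15 exactly, so the triple point of water,
    # 273.16 K, sits at exactly 0.01 degrees Celsius by definition.
    return t_k - 273.15

def celsius_to_kelvin(t_c):
    return t_c + 273.15

print(kelvin_to_celsius(273.16))  # 0.01 (triple point; up to float rounding)
print(celsius_to_kelvin(100.0))   # 373.15; no longer exactly the boiling point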

Of course, for informal everyday purposes, this redefinition makes no
practical difference; that is why the value "273.16" was chosen in the
first place: backwards compatibility. 0 degrees C and 100 degrees C
are still, as before, the respective melting and boiling points of
water within acceptable margins of error (let's say, +/- 0.02 degrees
for the boiling point (99.975 degrees C according to
http://www.lenntech.com/unit-conversion-calculator/temperature.htm)).
However, for purposes of metrology, it can no longer be claimed that
0/100 are the exact Celsius values of the melting/boiling points.

So, where does this leave Fahrenheit, as it is currently practiced in
my country (the archaic and scientifically backwards United States)?
Virtually every source I have consulted brazenly states the usual
simple (and exact) mathematical conversion between C and F (we are all
quite familiar with that), and even goes so far as to say that that
well-known conversion is strictly based on the 0/32 and 100/212
melting/boiling points. However, these points are no longer
applicable (in particular, no longer applicable to Celsius). As I said
previously, this state of affairs is fine and dandy for practical
everyday stuff (e.g. weather reports), but not for an amateur
metrologist like me.

The two obvious alternatives are (1) Fahrenheit has, in fact, been
redefined in terms of either Kelvin, or by way of the new Celsius
definition (which amounts to the same thing), thereby detaching it
from its former fixed points while I wasn't looking, and thus the usual
conversion formula is exact, as it was formerly, or (2) Fahrenheit
still retains its old fixed points of 32/212 because it has never been
officially redefined as such (to my knowledge), and thus there now
exists no exact conversion between C and F as there used to be, except
by way of modern measurement with an unavoidable margin of error.

So is it (1) or (2)? Or can we say that (3) there now exist two
separate Celsius scales in common usage: one exactly convertible to F
alone (in the US), and the other exactly convertible to K alone (in the
rest of the world)?

Also note that Russ Rowlett
(http://www.unc.edu/~rowlett/units/dictK.html) blandly states under his
entry for "kelvin (K)": "The kelvin equals exactly 1.8 degrees
Fahrenheit", which is entirely unacceptable in light of his definition
for "degree Fahrenheit (°F)": "... On this scale, the freezing point
of water (at normal sea level atmospheric pressure) turned out to be
about 32 °F and the boiling point about 212 °F. Eventually the scale
was precisely defined by these two temperatures."

I can't quite recall whether NIST has ever explicitly said that the
Fahrenheit is not, or is no longer, an acceptable unit for use with SI,
which I'm almost sure it isn't anyway. They don't even bother to mention
Fahrenheit on their current list of deprecated units
(http://physics.nist.gov/cuu/Units/outside.html), one way or the other,
which suggests that it was chucked outright while I wasn't looking.
And no, I don't suspect a conspiracy, either.

I'll bet Mr Nygaard knows what's going on here! :-)

DWIII
Andy Resnick
2006-07-07 17:15:33 UTC
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
<snip>

I'm not really sure what you are asking- First off, historically there
were many competing scales for temperature: Newton, Romer, Delisle,
Celsius, and Fahrenheit (among others). All used various
freezing/melting/boiling points of various forms of water, with varying
numbers of divisions between the two. Why Celsius and Fahrenheit's
became more widely used than anyone else's, I cannot say.

Kelvin and Rankine scales are absolute scales- they are thermodynamic
and have a proper conceptual foundation. The scales above, prior to
1860 or so were not, strictly speaking, well-defined measures of a
system (statistical mechanics provided the needed conceptual link
between energy and heat).

So I guess just as you reconcile the Kelvin and Celsius scales, so you
can reconcile the Rankine and Fahrenheit scales. And Kelvin and Rankine
scales differ only in the energy interval corresponding to '1 degree'.
--
Andrew Resnick, Ph.D.
Department of Physiology and Biophysics
Case Western Reserve University
Randy Poe
2006-07-07 17:32:12 UTC
Post by Andy Resnick
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
<snip>
I'm not really sure what you are asking- First off, historically there
were many competing scales for temperature: Newton, Romer, Delisle,
Celsius, and Fahrenheit (among others). All used various
freezing/melting/boiling points of various forms of water, with varying
numbers of divisions between the two. Why Celsius and Fahrenheit's
became more widely used than anyone else's, I cannot say.
Kelvin and Rankine scales are absolute scales- they are thermodynamic
and have a proper conceptual foundation. The scales above, prior to
1860 or so were not, strictly speaking, well-defined measures of a
system (statistical mechanics provided the needed conceptual link
between energy and heat).
So I guess just as you reconcile the Kelvin and Celsius scales, so you
can reconcile the Rankine and Fahrenheit scales. And Kelvin and Rankine
scales differ only in the energy interval corresponding to '1 degree'.
I think he's asking about that interval. Is a Kelvin exactly 1.8
degrees Fahrenheit, or something slightly different? If
Fahrenheit is still defined as a scale where water freezes
at 32 deg and boils at 212 deg, but on the Celsius scale
those points are no longer 0 and 100, then the conversion
factor is no longer 1.8.

- Randy
z***@netscape.net
2006-07-07 18:43:25 UTC
Post by Randy Poe
Post by Andy Resnick
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
<snip>
I'm not really sure what you are asking- First off, historically there
were many competing scales for temperature: Newton, Romer, Delisle,
Celsius, and Fahrenheit (among others). All used various
freezing/melting/boiling points of various forms of water, with varying
numbers of divisions between the two. Why Celsius and Fahrenheit's
became more widely used than anyone else's, I cannot say.
Kelvin and Rankine scales are absolute scales- they are thermodynamic
and have a proper conceptual foundation. The scales above, prior to
1860 or so were not, strictly speaking, well-defined measures of a
system (statistical mechanics provided the needed conceptual link
between energy and heat).
So I guess just as you reconcile the Kelvin and Celsius scales, so you
can reconcile the Rankine and Fahrenheit scales. And Kelvin and Rankine
scales differ only in the energy interval corresponding to '1 degree'.
I think he's asking about that interval. Is a Kelvin exactly 1.8
degrees Fahrenheit, or something slightly different? If
Fahrenheit is still defined as a scale where water freezes
at 32 deg and boils at 212 deg, but on the Celsius scale
those points are no longer 0 and 100, then the conversion
factor is no longer 1.8.
Nobody will ever know, since a Kelvin is defined
in terms of the triple point of H2O.
So to define the Kelvin, you have to define water,
rather than temperature or degrees.
Post by Randy Poe
- Randy
m***@cars3.uchicago.edu
2006-07-07 19:30:44 UTC
Post by Randy Poe
Post by Andy Resnick
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
<snip>
I'm not really sure what you are asking- First off, historically there
were many competing scales for temperature: Newton, Romer, Delisle,
Celsius, and Fahrenheit (among others). All used various
freezing/melting/boiling points of various forms of water, with varying
numbers of divisions between the two. Why Celsius and Fahrenheit's
became more widely used than anyone else's, I cannot say.
Kelvin and Rankine scales are absolute scales- they are thermodynamic
and have a proper conceptual foundation. The scales above, prior to
1860 or so were not, strictly speaking, well-defined measures of a
system (statistical mechanics provided the needed conceptual link
between energy and heat).
So I guess just as you reconcile the Kelvin and Celsius scales, so you
can reconcile the Rankine and Fahrenheit scales. And Kelvin and Rankine
scales differ only in the energy interval corresponding to '1 degree'.
I think he's asking about that interval. Is a Kelvin exactly 1.8
degrees Fahrenheit, or something slightly different? If
Fahrenheit is still defined as a scale where water freezes
at 32 deg and boils at 212 deg, but on the Celsius scale
those points are no longer 0 and 100, then the conversion
factor is no longer 1.8.
As I recall, Fahrenheit is currently defined in terms of Celsius
(thus, indirectly, of Kelvin), just as the inch is defined in terms of
meter. If somebody is interested enough, I'm sure NIST can provide
answers.

Mati Meron | "When you argue with a fool,
***@cars.uchicago.edu | chances are he is doing just the same"
DWIII
2006-07-08 03:46:48 UTC
Post by m***@cars3.uchicago.edu
Post by Randy Poe
Post by Andy Resnick
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
<snip>
I'm not really sure what you are asking- First off, historically there
were many competing scales for temperature: Newton, Romer, Delisle,
Celsius, and Fahrenheit (among others). All used various
freezing/melting/boiling points of various forms of water, with varying
numbers of divisions between the two. Why Celsius and Fahrenheit's
became more widely used than anyone else's, I cannot say.
Kelvin and Rankine scales are absolute scales- they are thermodynamic
and have a proper conceptual foundation. The scales above, prior to
1860 or so were not, strictly speaking, well-defined measures of a
system (statistical mechanics provided the needed conceptual link
between energy and heat).
So I guess just as you reconcile the Kelvin and Celsius scales, so you
can reconcile the Rankine and Fahrenheit scales. And Kelvin and Rankine
scales differ only in the energy interval corresponding to '1 degree'.
I think he's asking about that interval. Is a Kelvin exactly 1.8
degrees Fahrenheit, or something slightly different? If
Fahrenheit is still defined as a scale where water freezes
at 32 deg and boils at 212 deg, but on the Celsius scale
those points are no longer 0 and 100, then the conversion
factor is no longer 1.8.
As I recall, Fahrenheit is currently defined in terms of Celsius
(thus, indirectly, of Kelvin), just as the inch is defined in terms of
meter. If somebody is interested enough, I'm sure NIST can provide
answers.
Fahrenheit may have been redefined (as you say) by being explicitly
pegged to Celsius at some point, but I still cannot find any reference
whatsoever of any organization adopting such a change (analogous to the
redefinition of Celsius per BIPM's 10th CGPM of 1954).

To restate the question concisely: Does there exist an official (legal
or otherwise) definition of the Fahrenheit scale? And if so, what
exactly is the definition, and what specific organization maintains
that definition?

DWIII
m***@cars3.uchicago.edu
2006-07-08 04:23:34 UTC
Post by DWIII
Post by m***@cars3.uchicago.edu
Post by Randy Poe
Post by Andy Resnick
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
<snip>
I'm not really sure what you are asking- First off, historically there
were many competing scales for temperature: Newton, Romer, Delisle,
Celsius, and Fahrenheit (among others). All used various
freezing/melting/boiling points of various forms of water, with varying
numbers of divisions between the two. Why Celsius and Fahrenheit's
became more widely used than anyone else's, I cannot say.
Kelvin and Rankine scales are absolute scales- they are thermodynamic
and have a proper conceptual foundation. The scales above, prior to
1860 or so were not, strictly speaking, well-defined measures of a
system (statistical mechanics provided the needed conceptual link
between energy and heat).
So I guess just as you reconcile the Kelvin and Celsius scales, so you
can reconcile the Rankine and Fahrenheit scales. And Kelvin and Rankine
scales differ only in the energy interval corresponding to '1 degree'.
I think he's asking about that interval. Is a Kelvin exactly 1.8
degrees Fahrenheit, or something slightly different? If
Fahrenheit is still defined as a scale where water freezes
at 32 deg and boils at 212 deg, but on the Celsius scale
those points are no longer 0 and 100, then the conversion
factor is no longer 1.8.
As I recall, Fahrenheit is currently defined in terms of Celsius
(thus, indirectly, of Kelvin), just as the inch is defined in terms of
meter. If somebody is interested enough, I'm sure NIST can provide
answers.
Fahrenheit may have been redefined (as you say) by being explicitly
pegged to Celsius at some point, but I still cannot find any reference
whatsoever of any organization adopting such a change (analogous to the
redefinition of Celsius per BIPM's 10th CGPM of 1954).
To restate the question concisely: Does there exist an official (legal
or otherwise) definition of the Fahrenheit scale? And if so, what
exactly is the definition, and what specific organization maintains
that definition?
As I mentioned above, the place to contact is NIST. In the US, it is
their job to set and maintain such definitions.

Mati Meron | "When you argue with a fool,
***@cars.uchicago.edu | chances are he is doing just the same"
Andy Resnick
2006-07-10 12:52:38 UTC
DWIII wrote:
<snip>
Post by DWIII
To restate the question concisely: Does there exist an official (legal
or otherwise) definition of the Fahrenheit scale? And if so, what
exactly is the definition, and what specific organization maintains
that definition?
I doubt there's a concise official definition. This is not too
surprising in that there are also no clear (i.e. commerce-related)
definitions of many units: for example furlongs, fluid ounces, drams,
sieverts, diopters, and thousands more. These units originated in
various ways and are generally pegged to combinations of SI base units
in order to make (for example) your eyeglass prescription applicable to
my test plate. So, as the SI base units are refined, so are all the
many derived units. The original documentation linking the customary
units to SI units probably was written when the country in question
adopted the SI unit scheme.

NIST has some applicable documents; try starting here:

http://physics.nist.gov/cuu/Units/bibliography.html

The Kelvin was adopted in 1954, so perhaps there was an official
conversion adopted at the same time.

I used to think that optical and radiological units were the worst. Now
I think it's the clinical drug dosages:

http://www.unc.edu/~rowlett/units/scales/clinical_data.html

I mean seriously... mEq? IU? Can these be any more random?
--
Andrew Resnick, Ph.D.
Department of Physiology and Biophysics
Case Western Reserve University
Sylvain Croussette
2006-07-10 14:11:28 UTC
Post by Randy Poe
Post by Andy Resnick
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
<snip>
...
Post by Randy Poe
Post by Andy Resnick
So I guess just as you reconcile the Kelvin and Celsius scales, so you
can reconcile the Rankine and Fahrenheit scales. And Kelvin and Rankine
scales differ only in the energy interval corresponding to '1 degree'.
I think he's asking about that interval. Is a Kelvin exactly 1.8
degrees Fahrenheit, or something slightly different? If
Fahrenheit is still defined as a scale where water freezes
at 32 deg and boils at 212 deg, but on the Celsius scale
those points are no longer 0 and 100, then the conversion
factor is no longer 1.8.
- Randy
The link here

http://physics.nist.gov/Pubs/SP811/appenB8.html

says that to convert from degrees F to degrees C the formula is still
C = (F - 32)/1.8. The constants are in bold, and at the top of the page it
says this means they are exact values. So I guess this means the F
scale is defined, at least in the US where NIST defines these things,
in terms of the C scale, and that 32 and 212 deg F are no longer exact
values by definition for the temperatures of freezing and boiling water,
just as 0 and 100 no longer are on the Celsius scale.
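
In other words, that table amounts to something like the following Python
sketch (the function names are mine and purely illustrative, not anything
NIST publishes as code):

def fahrenheit_to_celsius(t_f):
    # The constants 32 and 1.8 are exact per NIST SP 811, Appendix B.8.
    return (t_f - 32.0) / 1.8

def fahrenheit_to_kelvin(t_f):
    return (t_f - 32.0) / 1.8 + 273.15

print(fahrenheit_to_celsius(212.0))   # 100.0 up to float rounding
print(fahrenheit_to_kelvin(32.018))   # ~273.16 K, the triple point of water
print(fahrenheit_to_kelvin(-459.67))  # ~0.0 K, absolute zero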

DWIII
2006-07-07 18:09:05 UTC
Post by Andy Resnick
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
<snip>
I'm not really sure what you are asking- First off, historically there
were many competing scales for temperature: Newton, Romer, Delisle,
Celsius, and Fahrenheit (among others). All used various
freezing/melting/boiling points of various forms of water, with varying
numbers of divisions between the two. Why Celsius and Fahrenheit's
became more widely used than anyone else's, I cannot say.
Kelvin and Rankine scales are absolute scales- they are thermodynamic
and have a proper conceptual foundation. The scales above, prior to
1860 or so were not, strictly speaking, well-defined measures of a
system (statistical mechanics provided the needed conceptual link
between energy and heat).
So I guess just as you reconcile the Kelvin and Celsius scales, so you
can reconcile the Rankine and Fahrenheit scales. And Kelvin and Rankine
scales differ only in the energy interval corresponding to '1 degree'.
That makes sense; if we start with a ratio of exactly 1.8 degrees
Rankine per Kelvin, that would put the triple point at exactly 491.688
degrees Rankine, and the (approx) freezing point of 273.15 K at 491.670
degrees R, for a difference of exactly 0.018 degrees R. From which it
follows that the triple point is exactly 32.018 degrees Fahrenheit (F
intervals taken as exactly equal to R intervals), and absolute zero at
exactly -459.670 degrees F.
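
Just as a sanity check of that arithmetic (Python, purely illustrative; it
assumes, as above, that F and R intervals are equal in size and that
degrees F = degrees R - 459.67):

triple_point_K = 273.16
ice_point_K = 273.15  # approximate freezing point of water

triple_point_R = triple_point_K * 1.8  # 491.688 degrees R
ice_point_R = ice_point_K * 1.8        # 491.670 degrees R
print(triple_point_R - ice_point_R)    # ~0.018 degrees R

print(triple_point_R - 459.67)         # ~32.018 degrees F (triple point)
print(0.0 * 1.8 - 459.67)              # -459.67 degrees F (absolute zero)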

Of course, this modifies the original Fahrenheit scale: like Celsius,
it is redefined with fixed points of -459.670 (absolute zero) and
32.018 (triple point). The major sticking points are that (1) we now
lose the original fixed points (which really isn't so bad, from a
metrological view), and (2) I'll bet you'll never convince the average
American who grew up with the old 32/212 scheme that this redefinition,
somehow, is better. Oh, well, you can't have everything...

DWIII
m***@cars3.uchicago.edu
2006-07-07 19:44:24 UTC
Post by DWIII
Post by Andy Resnick
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
<snip>
I'm not really sure what you are asking- First off, historically there
were many competing scales for temperature: Newton, Romer, Delisle,
Celsius, and Fahrenheit (among others). All used various
freezing/melting/boiling points of various forms of water, with varying
numbers of divisions between the two. Why Celsius and Fahrenheit's
became more widely used than anyone else's, I cannot say.
Kelvin and Rankine scales are absolute scales- they are thermodynamic
and have a proper conceptual foundation. The scales above, prior to
1860 or so were not, strictly speaking, well-defined measures of a
system (statistical mechanics provided the needed conceptual link
between energy and heat).
So I guess just as you reconcile the Kelvin and Celsius scales, so you
can reconcile the Rankine and Fahrenheit scales. And Kelvin and Rankine
scales differ only in the energy interval corresponding to '1 degree'.
That makes sense; if we start with a ratio of exactly 1.8 degrees
Rankine per Kelvin, that would put the triple point at exactly 491.688
degrees Rankine, and the (approx) freezing point of 273.15 K at 491.670
degrees R, for a difference of exactly 0.018 degrees R. From which it
follows that the triple point is exactly 32.018 degrees Fahrenheit (F
intervals taken as exactly equal to R intervals), and absolute zero at
exactly -459.670 degrees F.
Of course, this modifies the original Fahrenheit scale: like Celsius,
it is redefined with fixed points of -459.670 (absolute zero) and
32.018 (triple point). The major sticking points are that (1) we now
lose the original fixed points (which really isn't so bad, from a
metrological view), and (2) I'll bet you'll never convince the average
American who grew up with the old 32/212 scheme that this redefinition,
somehow, is better. Oh, well, you can't have everything...
Well, with regard to these two points:

1) Definitions of units are changing all the time, what else is new.
Consider how many times the definition of the meter, or the
second, has changed since inception.

2) As much as it may be disheartening for an amateur (or
professional, for that matter) metrologist, the average American has
absolutely no reason to care about this redefinition, and the 32/212
scheme is quite good enough for all but the most exacting use.

Mati Meron | "When you argue with a fool,
***@cars.uchicago.edu | chances are he is doing just the same"
Henning Makholm
2006-07-08 09:52:39 UTC
Post by m***@cars3.uchicago.edu
2) As much as it may be disheartening for an amateur (or
professional, for that matter) metrologist, the average American has
absolutely no reason to care about this redefinition, and the 32/212
scheme is quite good enough for all but the most exacting use.
Similarly, the average non-American neither knows nor cares that the
temperature scale he uses is not technically defined by water freezing
and boiling at 0/100 °C.
--
Henning Makholm "In my opinion, this child don't
need to have his head shrunk at all."
m***@cars3.uchicago.edu
2006-07-08 10:01:40 UTC
Post by Henning Makholm
Post by m***@cars3.uchicago.edu
2) As much as it may be disheartening for an amateur (or
professional, for that matter) metrologist, the average American has
absolutely no reason to care about this redefinition, and the 32/212
scheme is quite good enough for all but the most exacting use.
Similarly, the average non-American neither knows nor cares that the
temperature scale he uses is not technically defined by water freezing
and boiling at 0/100 °C.
Exactly.

Mati Meron | "When you argue with a fool,
***@cars.uchicago.edu | chances are he is doing just the same"
Ben Newsam
2006-07-08 12:05:39 UTC
On Sat, 08 Jul 2006 11:52:39 +0200, Henning Makholm
Post by Henning Makholm
Similarly, the average non-American neither knows nor cares that the
temperature scale he uses is not technically defined by water freezing
and boiling at 0/100 °C.
On the contrary, we care a great deal. When the temperature dips below
zero, we know it is freezing. No need to do a mental calculation
first.
m***@cars3.uchicago.edu
2006-07-08 13:29:38 UTC
Post by Ben Newsam
On Sat, 08 Jul 2006 11:52:39 +0200, Henning Makholm
Post by Henning Makholm
Similarly, the average non-American neither knows nor cares that the
temperature scale he uses is not technically defined by water freezing
and boiling at 0/100 °C.
On the contrary, we care a great deal. When the temperature dips below
zero, we know it is freezing. No need to do a mental calculation
first.
Eh?

Mati Meron | "When you argue with a fool,
***@cars.uchicago.edu | chances are he is doing just the same"
Henning Makholm
2006-07-08 18:13:23 UTC
Post by Ben Newsam
On Sat, 08 Jul 2006 11:52:39 +0200, Henning Makholm
Post by Henning Makholm
Similarly, the average non-American neither knows nor cares that the
temperature scale he uses is not technically defined by water freezing
and boiling at 0/100 °C.
On the contrary, we care a great deal. When the temperature dips below
zero, we know it is freezing. No need to do a mental calculation
first.
Are you suggesting that it somehow forces you to do a mental
calculation that "water freezes at 0 °C" is no longer a definition
but merely an easily-remembered experimental fact?
--
Henning Makholm "Nemo enim fere saltat sobrius, nisi forte insanit."
Ben Newsam
2006-07-08 20:01:12 UTC
On Sat, 08 Jul 2006 20:13:23 +0200, Henning Makholm
Post by Henning Makholm
Post by Ben Newsam
On Sat, 08 Jul 2006 11:52:39 +0200, Henning Makholm
Post by Henning Makholm
Similarly, the average non-American neither knows nor cares that the
temperature scale he uses is not technically defined by water freezing
and boiling at 0/100 °C.
On the contrary, we care a great deal. When the temperature dips below
zero, we know it is freezing. No need to do a mental calculation
first.
Are you suggesting that it somehow forces you to do a mental
calculation that "water freezes at 0 °C" is no longer a definition
but merely an easily-remembered experimental fact?
Indeed not. It is a matter of supreme indifference to me, as the
effect is the same in any case.
Henning Makholm
2006-07-09 12:44:41 UTC
Post by Ben Newsam
On Sat, 08 Jul 2006 20:13:23 +0200, Henning Makholm
Post by Henning Makholm
Post by Ben Newsam
On Sat, 08 Jul 2006 11:52:39 +0200, Henning Makholm
Post by Henning Makholm
Similarly, the average non-American neither knows nor cares that the
temperature scale he uses is not technically defined by water freezing
and boiling at 0/100 °C.
On the contrary, we care a great deal. When the temperature dips below
zero, we know it is freezing. No need to do a mental calculation
first.
Are you suggesting that it somehow forces you to do a mental
calculation that "water freezes at 0 °C" is no longer a definition
but merely an easily-remembered experimental fact?
Indeed not. It is a matter of supreme indifference to me, as the
effect is the same in any case.
How can it be "a matter of supreme indifference" to you when you also
state that you "care a great deal" about it?
--
Henning Makholm "Ambiguous cases are defined as those for which the
compiler being used finds a legitimate interpretation
which is different from that which the user had in mind."
z***@netscape.net
2006-07-07 17:26:43 UTC
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
Both Fahrenheit and Celsius, after some historical numerological
juggling, eventually settled their respective scales on two fixed
points: the melting point (0 degrees C; 32 degrees F) and the boiling
point (100 degrees C; 212 degrees F) of water. Exactly under what
environmental conditions, with what purity of water sample, etc., these
points were practically realized is a fine detail, but not salient to
my point here.
Both Fahrenheit and Celsius are just as well-defined as
any other temperature measurement system, if you know what
locally-linear means, since all of science is ultimately
based on that concept.
Post by DWIII
Later, with the adoption of the Kelvin scale by SI, whose two fixed
points are taken to be absolute zero and the triple point of water,
separated by exactly 273.16 divisions, the Celsius scale was redefined
(for scientific purposes) in terms of kelvins to wit: 0.01 degrees
Celsius is now identified with the triple point at 273.16 K, and the
size of the degree Celsius is defined to be exactly equal to the Kelvin
(http://physics.nist.gov/cuu/Units/kelvin.html). Note that this
redefinition does away with the former Celsius fixed points (0, 100).
Of course, for informal everyday purposes, this redefinition makes no
practical difference; that is why the value "273.16" was chosen in the
first place: backwards compatibility. 0 degrees C and 100 degrees C
are still, as before, the respective melting and boiling points of
water within acceptable margins of error (let's say, +/- 0.02 degrees
for the boiling point (99.975 degrees C according to
http://www.lenntech.com/unit-conversion-calculator/temperature.htm)).
However, for purposes of metrology, it can no longer be claimed that
0/100 are the exact Celsius values of the melting/boiling points.
So, where does this leave Fahrenheit, as it is currently practiced in
my country (the archaic and scientifically backwards United States)?
It leaves it just where it was 200 years ago, when Fahrenheit
invented it. Outside of moron European Watts and Trains.
Post by DWIII
Virtually every source I have consulted brazenly states the usual
simple (and exact) mathematical conversion between C and F (we are all
quite familiar with that), and even goes so far as to say that that
well-known conversion is strictly based on the 0/32 and 100/212
melting/boiling points. However, these points are no longer
applicable (in particular, no longer applicable to Celsius). As I said
previously, this state of affairs is fine and dandy for practical
everyday stuff (e.g. weather reports), but not for an amateur
metrologist like me.
The two obvious alternatives are (1) Fahrenheit has, in fact, been
redefined in terms of either Kelvin, or by way of the new Celsius
definition (which amounts to the same thing), thereby detaching it
from its former fixed points while I wasn't looking, and thus the usual
conversion formula is exact, as it was formerly, or (2) Fahrenheit
still retains its old fixed points of 32/212 because it has never been
officially redefined as such (to my knowledge), and thus there now
exists no exact conversion between C and F as there used to be, except
by way of modern measurement with an unavoidable margin of error.
So is it (1) or (2)? Or can we say that (3) there now exist two
separate Celsius scales in common usage: one exactly convertible to F
alone (in the US), and the other exactly convertible to K alone (in the
rest of the world)?
Also note that Russ Rowlett
(http://www.unc.edu/~rowlett/units/dictK.html) blandly states under his
entry for "kelvin (K)": "The kelvin equals exactly 1.8 degrees
Fahrenheit", which is entirely unacceptable in light of his definition
for "degree Fahrenheit (°F)": "... On this scale, the freezing point
of water (at normal sea level atmospheric pressure) turned out to be
about 32 °F and the boiling point about 212 °F. Eventually the scale
was precisely defined by these two temperatures."
I can't quite recall whether NIST has ever explicitly said that the
Fahrenheit is not, or is no longer, an acceptable unit for use with SI,
which I'm almost sure it isn't anyway. They don't even bother to mention
Fahrenheit on their current list of deprecated units
(http://physics.nist.gov/cuu/Units/outside.html), one way or the other,
which suggests that it was chucked outright while I wasn't looking.
And no, I don't suspect a conspiracy, either.
I'll bet Mr Nygaard knows what's going on here! :-)
DWIII
ABarlow
2006-07-08 11:43:02 UTC
Post by DWIII
This is some minor issue that I have been wrestling with on my own for
some time now, and intensive web/newsgroup searching has come up empty.
Firstly I will go over the basics as I understand them.
Both Fahrenheit and Celsius, after some historical numerological
juggling, eventually settled their respective scales on two fixed
points: the melting point (0 degrees C; 32 degrees F) and the boiling
point (100 degrees C; 212 degrees F) of water. Exactly under what
environmental conditions, with what purity of water sample, etc., these
points were practically realized is a fine detail, but not salient to
my point here.
Later, with the adoption of the Kelvin scale by SI, whose two fixed
points are taken to be absolute zero and the triple point of water,
separated by exactly 273.16 divisions, the Celsius scale was redefined
(for scientific purposes) in terms of kelvins to wit: 0.01 degrees
Celsius is now identified with the triple point at 273.16 K, and the
size of the degree Celsius is defined to be exactly equal to the Kelvin
(http://physics.nist.gov/cuu/Units/kelvin.html). Note that this
redefinition does away with the former Celsius fixed points (0, 100).
Of course, for informal everyday purposes, this redefinition makes no
practical difference; that is why the value "273.16" was chosen in the
first place: backwards compatibility. 0 degrees C and 100 degrees C
are still, as before, the respective melting and boiling points of
water within acceptable margins of error (let's say, +/- 0.02 degrees
for the boiling point (99.975 degrees C according to
http://www.lenntech.com/unit-conversion-calculator/temperature.htm)).
However, for purposes of metrology, it can no longer be claimed that
0/100 are the exact Celsius values of the melting/boiling points.
So, where does this leave Fahrenheit, as it is currently practiced in
my country (the archaic and scientifically backwards United States)?
Virtually every source I have consulted brazenly states the usual
simple (and exact) mathematical conversion between C and F (we are all
quite familiar with that), and even goes so far as to say that that
well-known conversion is strictly based on the 0/32 and 100/212
melting/boiling points. However, these points are no longer
applicable (in particular, no longer applicable to Celsius). As I said
previously, this state of affairs is fine and dandy for practical
everyday stuff (e.g. weather reports), but not for an amateur
metrologist like me.
The two obvious alternatives are (1) Fahrenheit has, in fact, been
redefined in terms of either Kelvin, or by way of the new Celsius
definition (which amounts to the same thing), thereby detaching it
from its former fixed points while I wasn't looking, and thus the usual
conversion formula is exact, as it was formerly, or (2) Fahrenheit
still retains its old fixed points of 32/212 because it has never been
officially redefined as such (to my knowledge), and thus there now
exists no exact conversion between C and F as there used to be, except
by way of modern measurement with an unavoidable margin of error.
So is it (1) or (2)? Or can we say that (3) there now exist two
separate Celsius scales in common usage: one exactly convertible to F
alone (in the US), and the other exactly convertible to K alone (in the
rest of the world)?
Also note that Russ Rowlett
(http://www.unc.edu/~rowlett/units/dictK.html) blandly states under his
entry for "kelvin (K)": "The kelvin equals exactly 1.8 degrees
Fahrenheit", which is entirely unacceptable in light of his definition
for "degree Fahrenheit (°F)": "... On this scale, the freezing point
of water (at normal sea level atmospheric pressure) turned out to be
about 32 °F and the boiling point about 212 °F. Eventually the scale
was precisely defined by these two temperatures."
I can't quite recall whether NIST has ever explicitly said that the
Fahrenheit is not, or is no longer, an acceptable unit for use with SI,
which I'm almost sure it isn't anyway. They don't even bother to mention
Fahrenheit on their current list of deprecated units
(http://physics.nist.gov/cuu/Units/outside.html), one way or the other,
which suggests that it was chucked outright while I wasn't looking.
And no, I don't suspect a conspiracy, either.
I'll bet Mr Nygaard knows what's going on here! :-)
DWIII
Well, the technical answer is that all temperature scales stem from the
0th Law of Thermodynamics, and that any temperature scale is
appropriate, and all temperature scales are convertible. For instance,
a mercury-in-glass temperature scale measures the expansion of a
specific amount of mercury between two fixed and arbitrary points. The
same is true of pretty much every other temperature scale. I believe
that the Fahrenheit scale happens to be defined in terms of the melting
and boiling points of saltpeter, or something like that.
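
To make that concrete, a scale pegged linearly between two fixed points can
be sketched in a few lines of Python (the readings and numbers here are made
up, purely for illustration):

def scale_from_fixed_points(x_low, x_high, t_low, t_high):
    # Map a raw reading x (say, a mercury column length in mm) to a
    # temperature, assuming the scale is linear between the two fixed points.
    def to_temperature(x):
        return t_low + (x - x_low) * (t_high - t_low) / (x_high - x_low)
    return to_temperature

# A hypothetical thermometer reading 12.0 mm at the ice point and 184.0 mm
# at the steam point, graduated in Fahrenheit:
fahrenheit = scale_from_fixed_points(12.0, 184.0, 32.0, 212.0)
print(fahrenheit(98.0))  # 122.0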

A.