Discussion:
radiotap DB vs DBM, signedness
Greg Troxel
2006-09-13 18:47:29 UTC
I've recently been trying to make more careful signal and noise
measurements with ath(4) cards. (You can see my fleet of All-Terrain
Radio Flyer Wagons, complete with Novatel GPS units with local
differential corrections at http://www.lexort.com/gallery/random-geek )

I've had two sources of consternation: noise values and calibration
intervals, and the sign and units of rssi/nf. Sam Leffler has helped
me out greatly off-list and I'm writing to propose fixes and explain
what I've learned.

1) Calibration and noise values

I'm still gathering data, but basically the noise values don't change
often. Setting ath.calibrate to 1 (from 30) should result in more
frequent calibration and hence more accurate/recent noise
measurements.

2) radiotap and signal representation

Radiotap has both

IEEE80211_RADIOTAP_DBM_ANTSIGNAL

and

IEEE80211_RADIOTAP_DB_ANTSIGNAL

for signal strength. The former is relative to 1 mW (dBm); the latter
is relative to an arbitrary, unspecified reference. While it might
well be good to have the latter, the useful thing to do is have some
rough calibration done and put the constant in the driver.

It turns out Atheros chips return values in dBm, so I have a local
patch to use IEEE80211_RADIOTAP_DBM_ANT{SIGNAL,NOISE} instead of the
DB flavors.

I also patched tcpdump to show dBm when DBM is used, rather than dB,
so one can tell what's going on.

With those changes, I see packets like this:

13:57:40.467100 965620941716us tsft 1.0 Mb/s 2462 MHz (0x0480) -41dBm signal -96dBm noise antenna 1 Beacon (adroit-collab) [1.0* 2.0* 5.5* 6.0 9.0 11.0* 12.0 18.0 Mbit] IBSS CH: 11, PRIVACY

which is much more useful and tied to reality than the previous values
of 201 and 160 dB, which turned out to be dB relative to -256 dBm.


The other problem is that IEEE80211_RADIOTAP_DB_ANT is in u_int8_t
rather than int8_t. For dB, this doesn't make sense.

So, barring objections I'm going to

fix sys/dev/ic/athioctl.h to use DBM flavors of radiotap fields
fix tcpdump to print dBm for DBM

I'd also like to adjust the DB fields to be signed, and make the
matching change in tcpdump. Then everything that uses them can change,
or can at least ask the question of how to turn these values into
something like dBm. But it's sort of an ABI change for radiotap.
Thoughts?

Greg Troxel <***@ir.bbn.com>
David Young
2006-09-19 17:48:34 UTC
Post by Greg Troxel
The other problem is that IEEE80211_RADIOTAP_DB_ANT is in u_int8_t
rather than int8_t. For dB, this doesn't make sense.
Why doesn't it make sense? If you choose the reference point right, then
all dB values will fit between 0 and 255. Maybe I am missing something.

Dave
--
David Young OJC Technologies
***@ojctech.com Urbana, IL * (217) 278-3933

--
Posted automagically by a mail2news gateway at muc.de e.V.
Please direct questions, flames, donations, etc. to news-***@muc.de
Greg Troxel
2006-09-20 12:48:05 UTC
Post by David Young
Post by Greg Troxel
The other problem is that IEEE80211_RADIOTAP_DB_ANT is in u_int8_t
rather than int8_t. For dB, this doesn't make sense.
Why doesn't it make sense? If you choose the reference point right, then
all dB values will fit between 0 and 255. Maybe I am missing something.
[I know you know some of what I'm saying, but I'm including it for the
broader audience.]

It doesn't make sense because dB relative to a reference is
fundamentally a signed quantity. We're talking about how to encode a
real number in a computer representation. So the two choices are the
degree of quantization and the range. We now have fixed-point with
256 values, and that seems fine. So therefore we can either have -128
to +127 relative to a reference, or we can have +0 to +255. With
signed, the system is well behaved for any reasonable choice of
reference. With unsigned, the reference has to be below the minimum
value that will ever happen or one gets wrapping.

1 mW (used for dBm) is the standard reference level, and in my view an
'unspecified reference' is simply a way to say that the calibration to
dBm is lame or unknown.

With a value that's actually in dBm, and the current DB_ANT encoding,
I got values like 160 dB for noise and 220 dB for a strong signal.
This is relative to -256 dBm, which is an odd choice. And then values
like +20 dBm can't be represented on the same scale. Had this been
signed, I would have obtained -96 and -36 dB, which would have been
decent values for "uncalibrated dBm" in this case.


I think my real point is that you shouldn't be forced to choose a
reference point that's on the low end of all possible values. With a
signed representation and the notion that a reference closer to 1 mW
is better, there's no grief.
--
Greg Troxel <***@ir.bbn.com>
David Young
2006-09-21 19:49:39 UTC
Post by Greg Troxel
Post by David Young
Post by Greg Troxel
The other problem is that IEEE80211_RADIOTAP_DB_ANT is in u_int8_t
rather than int8_t. For dB, this doesn't make sense.
Why doesn't it make sense? If you choose the reference point right, then
all dB values will fit between 0 and 255. Maybe I am missing something.
[I know you know some of what I'm saying, but I'm including it for the
broader audience.]
I should say, first, that I am happy for radiotap to sprout a new,
signed-dB field, if that is required.
Post by Greg Troxel
It doesn't make sense because dB relative to a reference is
fundamentally a signed quantity. We're talking about how to encode a
real number in a computer representation. So the two choices are the
degree of quantization and the range. We now have fixed-point with
256 values, and that seems fine. So therefore we can either have -128
to +127 relative to a reference, or we can have +0 to +255. With
signed, the system is well behaved for any reasonable choice of
reference. With unsigned, the reference has to be below the minimum
value that will ever happen or one gets wrapping.
In practice, choosing a reasonable reference will be easy. One reads
the dB from a fixed-width hardware register, or a DMA descriptor, and
the minimum value is either zero (unsigned register) or -128 (signed),
-64, -32, ..., depending on the width of the register.
Post by Greg Troxel
1 mW (used for dBm) is the standard reference level, and in my view an
'unspecified reference' is simply a way to say that the calibration to
dBm is lame or unknown.
Lame or unknown is right. My aim with radiotap has been to give a driver
developer a choice of fields, ranging from the perfectly vague to the
scientifically precise, with a vague field always preferred to a lying
field. :-)
Post by Greg Troxel
With a value that's actually in dBm, and the current DB_ANT encoding,
I got values like 160 dB for noise and 220 dB for a strong signal.
This is relative to -256 dBm, which is an odd choice. And then values
like +20 dBm can't be represented on the same scale. Had this been
signed, I would have obtained -96 and -36 dB, which would have been
decent values for "uncalibrated dBm" in this case.
I understand and appreciate the principled argument you are making. As a
practical matter, however, 20 dBm is an astonishingly high RSSI! I figure
it is a value way higher than the hardware's descriptor/register can even
express. I think of the _DB_ANT fields as holding raw, uncalibrated,
log-scale measurements from the hardware, whose range and reference is
set by the range of some hardware register, be it 5, 6, ..., or 8 bits
wide, signed or unsigned.
Post by Greg Troxel
I think my real point is that you shouldn't be forced to choose a
reference point that's on the low end of all possible values. With a
signed representation and the notion that a reference closer to 1 mW
is better, there's no grief.
Ok. My point is that while you are forced to choose such a reference
point, it's easy. :-)

Dave
--
David Young OJC Technologies
***@ojctech.com Urbana, IL * (217) 278-3933
