significant figures in averaging samples
I can't seem to find anything about this, but I thought that for every 10 samples (of the same thing) you averaged together, you gained 1 significant figure. You'd maybe need 100 samples to gain 2 figures.
I'm talking about a context where a computer or pulse generator fires a pulsed laser in a loop and you're looking at a change in absorbance or fluorescence with a photodiode or photomultiplier tube. You see a trace on a digital storage scope, and you average hundreds or thousands of them for improved accuracy.
I used to work for chemists, but my job was building electronics to their specifications: I built the amplifier for the photodiode and interfaced the storage scope to computers. I know improving accuracy was the purpose of averaging, but I don't remember how the significant figures worked out. It was also 30+ years ago.
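To make the effect concrete: when the underlying trace repeats from sweep to sweep and the noise does not, averaging N sweeps shrinks the noise by roughly sqrt(N). Below is a minimal simulation of that idea, a sketch only; the decay shape, noise level, and sweep counts are made up for illustration and are not taken from any real apparatus.

```python
# Minimal simulation of averaging repeated scope traces: the repeatable signal
# survives while uncorrelated noise shrinks roughly as 1/sqrt(N).
# All numbers here (decay shape, noise level, sweep counts) are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

t = np.linspace(0.0, 1.0, 500)
signal = np.exp(-t / 0.2)       # stand-in for a fluorescence decay trace
sigma = 0.05                    # assumed RMS noise on a single sweep

for n_sweeps in (1, 10, 100, 1000):
    sweeps = signal + rng.normal(0.0, sigma, size=(n_sweeps, t.size))
    averaged = sweeps.mean(axis=0)
    residual = np.std(averaged - signal)
    print(f"{n_sweeps:5d} sweeps: residual noise ~ {residual:.4f}"
          f"  (sigma/sqrt(N) ~ {sigma / np.sqrt(n_sweeps):.4f})")
```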
Tags: average, significant-figures
asked Jan 13 at 1:21 by Alan Corey
Hi & welcome to MSE. I'm not an expert in statistics, so maybe somebody else can give you a more definitive response. I don't think there's any specific number of values to average to get an extra significant digit of accuracy. It depends a lot on how much variance there is in the data. If it's all very close together, then even a handful may work. However, if, for example, there is a $1000$-fold variation between the minimum and maximum values, with you also getting everything in between, it'll require many more values for you to have reasonable confidence in an extra digit of accuracy.
– John Omielan, Jan 13 at 1:30
No, nothing like a 1000-fold variance; this was more of a signal-to-noise-ratio thing. Ideally the samples would have been identical, but there were various error sources that only averaging could really hope to diminish.
– Alan Corey, Jan 13 at 1:48
I was just stating $1000$ to emphasize how important the variance can be. As for it being a "signal to noise ratio" thing (along with any other error sources), the answer to your question will depend largely on the range and variance of the errors you can reasonably expect to encounter. These obviously depend on your particular situation, such as the type of machine you're using, so they are something you should try to determine, or at least get a rough idea of. Somebody here can likely give you a reasonable formula of some sort to use once you have these values.
– John Omielan, Jan 13 at 1:53
Welcome to the Mathematics Stack Exchange community! The quick tour will help you get the most benefit from your time here.
– dantopa, Jan 13 at 2:28
Actually, what got me started here is that I monitor outdoor temperatures with an Oregon Scientific thermometer but also record the values transmitted over radio on a computer. In 2.5 years I've collected 1.5 million data points, at about one data point per minute. If I say the mean temperature for a given month was $x$, how should I limit the precision in my printf? By default I see 6 decimal places.
– Alan Corey, Jan 13 at 13:49
1 Answer
I think the statistical fact you need to know in this case is that if $X_1, X_2, X_3, \dots, X_n$ are independent, identically distributed random variables, each with standard deviation $\sigma$, then the standard deviation of their mean, $\overline{X}$, is $\sigma / \sqrt{n}$. So if you have some estimate of the error in each of your $n$ temperature readings, then just divide that error estimate by $\sqrt{n}$ to find the error in the average.
In terms of significant digits: since the error of the mean shrinks like $1/\sqrt{n}$, cutting it by a factor of 10 (one more significant digit) requires 100 times as many samples as in your original data. If you want two more significant digits, you need 10,000 times as many samples.
All this assumes that your readings are in fact independent and unbiased. If that assumption is false, all bets are off.
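A quick numeric check of both claims, under the same i.i.d. assumption; the per-reading error sigma below is a made-up value used only to show the scaling:

```python
# Numeric illustration of the two claims above, under the i.i.d. assumption:
# the error of the mean is sigma/sqrt(n), so a 10x smaller error (one more
# significant digit) costs 100x as many samples.
import math

sigma = 0.5  # assumed error of a single reading; made-up value for illustration
for n in (100, 10_000, 1_000_000):
    print(f"n = {n:>9}: error of the mean ~ {sigma / math.sqrt(n):.4g}")
# n =       100: error of the mean ~ 0.05
# n =     10000: error of the mean ~ 0.005
# n =   1000000: error of the mean ~ 0.0005
```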
Reference: An Introduction to Error Analysis: The Study of Uncertainties in Physical Measurements, Second Edition, by John R. Taylor.
answered Jan 13 at 15:29 (edited Jan 13 at 16:19) by awkward
OK, there's a mishmash of concepts here too. I get at most 3 significant figures (I've seen 21.6 °C), but the A/D converter in the sensor works in bits of resolution. Natively it does degrees C, but 21.6 is probably 216 counts. I doubt it's an 8-bit A/D, but it could be 10.
– Alan Corey, Jan 13 at 17:21
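A back-of-the-envelope check of that guess; the temperature span below is an assumption, since the sensor's true range isn't stated here:

```python
# How many bits does it take to cover a plausible outdoor range in 0.1 degC steps?
# The assumed span is a guess; the sensor's documented range may differ.
import math

t_min, t_max = -40.0, 60.0   # assumed span in degrees C
step = 0.1                   # reported resolution in degrees C

codes = int(round((t_max - t_min) / step)) + 1   # 1001 distinct readings
bits = math.ceil(math.log2(codes))               # -> 10
print(f"{codes} codes need at least a {bits}-bit converter")
```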
Looking at last year's data I see 509,966 samples for 12 months, so each month is about 42,497. I guess I'll set the printf to %.3f and keep 3 digits after the decimal. I'm partly looking at how difficult it is to see global warming; you sort of have to know it's there, it doesn't jump out at you. But I've only got 2.5 years of data too.
– Alan Corey, Jan 13 at 17:29
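As a rough sanity check on that choice, here is a sketch applying the sigma/sqrt(n) rule from the answer to those numbers. The per-reading uncertainty is an assumed value (half the 0.1 °C step), and minute-by-minute temperatures are strongly correlated rather than independent, so this is an optimistic lower bound on the real uncertainty, not a calibration statement:

```python
# How small is the standard error of a monthly mean with ~42,497 readings?
# Assumes independent readings, which per-minute temperatures are not, so treat
# the result as an optimistic bound rather than a real error bar.
import math

n = 42497                  # readings in one month, from the comment above
per_reading_sigma = 0.05   # assumed single-reading uncertainty in degC

sem = per_reading_sigma / math.sqrt(n)
print(f"standard error of the monthly mean ~ {sem:.5f} degC")  # ~ 0.00024
print("monthly mean = %.3f degC" % 21.63742)  # made-up example value, using the %.3f format from the comment
```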
For anybody who might want to do this: get an Oregon Scientific sensor and an RTL2832 dongle, and download rtl_433 from github.com/merbanan/rtl_433. It runs on a Raspberry Pi, maybe even a Pi Zero, so you can do this for well under $50.
– Alan Corey, Jan 13 at 17:34