Sunday, February 15, 2015

On Measurements

This article aims to fill in what the last report on Measurements left out.

As we said in the last post, measurements are inherently "uncertain". That is because measurements, in contrast to pure numbers, are never exact. Three factors contribute to this uncertainty. First is the nature of the quantity being measured: some quantities fluctuate depending on factors like temperature. Second is the judgment of the experimenter. Is it 2 or 2.5? 2.5 or 2.75? It depends on the experimenter (on how sound his judgment is). And lastly, there is the limitation of the measuring device: some devices are calibrated only coarsely, while others are accurate even to several decimal places. In every measurement there is always uncertainty, and a good measurement is one that comes with an error report.

A consistent difference from the accepted value is called systematic error. This one is a problem of accuracy, and it usually comes from problems within the system (a wrong measurement procedure, bias, and the like). The second kind is random error. When you have this error, you lack precision: you may be accurate on average, but your data points are not close to each other. Therefore a good error analysis is one that accounts for both systematic and random errors, one that considers both accuracy and precision.
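To make the accuracy/precision distinction concrete, here is a minimal sketch in Python. The accepted value and the five trial readings are invented for illustration; the idea is only that the offset of the average from the accepted value estimates the systematic error, while the scatter of the trials estimates the random error.

```python
import statistics

# Hypothetical repeated measurements of a length whose accepted
# value is 2.50 cm (all numbers invented for illustration).
accepted = 2.50
trials = [2.61, 2.58, 2.63, 2.60, 2.59]

mean = statistics.mean(trials)

# Systematic error: how far the average sits from the accepted
# value (a problem of accuracy).
systematic = mean - accepted

# Random error: how much the trials scatter around their own
# average (a problem of precision), taken here as the sample
# standard deviation.
random_error = statistics.stdev(trials)

print(f"mean = {mean:.3f}")
print(f"systematic error = {systematic:+.3f}")
print(f"random error = {random_error:.3f}")
```

In this made-up data set the readings are tightly clustered (small random error) but all sit above the accepted value (large systematic error): precise, yet inaccurate.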

In this part I will talk about the different levels of orders of approximation. What are these? Basically, they lessen the uncertainty in measurements. The first one is the order-of-magnitude approximation. Quite straightforward: you just approximate the measured value. The example given by our prof is the Fermi question, which asks about the radius of an atomic bomb explosion. Surely you would just approximate it (you won't even measure it at all!). For me this is for practicality rather than accuracy. Second is using significant figures. Here the estimated uncertainty is taken as 1/2 the device's precision, or least count, and the number of significant figures in the measurement is equal to the number of digits you are certain of. Third is limited measurements, which is just the MIN-MAX method. What I wanted to focus on (and what our prof also focused on) is the distribution function.
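The second and third schemes above can be sketched in a few lines of Python. All the readings and the least count below are invented examples: half the least count as the estimated uncertainty for a single reading, and the MIN-MAX midpoint with half the spread for a set of limited measurements.

```python
# Significant figures: the estimated uncertainty is taken as half
# the least count of the device. Say a ruler marked in millimetres
# (least count 0.1 cm):
least_count = 0.1            # cm
uncertainty = least_count / 2
reading = 2.3                # cm, certain only to the ruler's markings
print(f"length = {reading} +/- {uncertainty} cm")

# Limited measurements (MIN-MAX): quote the midpoint of the extreme
# readings, with half their spread as the uncertainty.
readings = [2.2, 2.4, 2.3, 2.5, 2.3]   # cm, invented trials
best = (max(readings) + min(readings)) / 2
spread = (max(readings) - min(readings)) / 2
print(f"length = {best} +/- {spread} cm")
```

Note how the MIN-MAX rule ignores everything between the extremes, which is part of why the distribution function discussed next is the more informative treatment.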

It says that no matter how much data you have, as long as it is normalized the distribution will approach a bell shape (the central limit theorem). The more data, the closer the graph gets to that shape.
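A small simulation illustrates the bell-shape claim without any plotting. This is my own sketch, not from the post: averages of repeated uniform random "measurements" cluster around the true mean, and the scatter of those averages shrinks as each sample gets bigger, which is the central limit theorem at work.

```python
import random
import statistics

random.seed(0)  # fixed seed so the run is repeatable

def sample_means(n_per_sample, n_samples=2000):
    """Mean of n_per_sample uniform(0, 1) draws, repeated n_samples times."""
    return [statistics.mean(random.uniform(0, 1) for _ in range(n_per_sample))
            for _ in range(n_samples)]

small = sample_means(5)    # averages of 5 "measurements" each
large = sample_means(50)   # averages of 50 "measurements" each

# Both sets of averages centre near the true mean of 0.5, but the
# averages built from more data scatter much less around it.
print(statistics.mean(small), statistics.stdev(small))
print(statistics.mean(large), statistics.stdev(large))
```

Histogramming either list would show the bell shape; the standard deviations printed above show why more data gives a tighter, better-looking curve.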
C. G. Sevilla, "What is the Length of a Rice Grain?", blog post, http://thephysics101p1files.blogspot.com/2015/02/what-is-length-of-1-rice-grain.html?view=sidebar, accessed 02/16/2015.
