I've been using this particular smart weight scale for the past year: [vont.com]. I find it more efficient and convenient than the normal scales we had been using. Of course, I test it every three months to make sure it still works perfectly. I think the smart ones are better, since they have a lot of useful functions and motivate you to keep your body measurements where you want them, and at the same time they come at a very affordable price.
Either way, it is best to calibrate and/or test your scale yourself every now and then, using known weights.
Since you seem to be talking about bathroom scales, though, it is hard to see why great accuracy would be needed. Things like the body mass index are only approximate recommendations, and human weight varies through the day by several pounds according to many factors, such as how hydrated you are, so most scales should be more than accurate enough. If you do want greater accuracy (though why?), a better way would be to weigh yourself several times during the day and average the results.
Bunch of stupid comments here. I was a certified scale tech for twenty-two years.
There is NO comparison. Digital scales are more accurate. Springs wear and get weak. Levers get corroded. Load cells do not malfunction unless they are mistreated. Accuracy and load capacity depend on the load cell type. The load cell is an analog device, and its output is then decoded by an analog-to-digital converter chip. It doesn't matter what the bit rate is.
The accuracy is based on the load cell's design.
There is no comparison.
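To make that concrete, here is a rough Python sketch of what the firmware does with the load cell's signal, using a simple two-point calibration. All the counts and the test weight are invented for illustration:

# Hypothetical sketch: turning raw ADC counts from a load cell into a
# weight via two-point (zero and span) calibration. Numbers are made up.
ZERO_COUNTS = 8192        # ADC reading with the platform empty (assumed)
SPAN_COUNTS = 57344       # ADC reading with a known 100.0 kg test weight (assumed)
SPAN_WEIGHT_KG = 100.0

def counts_to_weight(counts):
    # Linear interpolation between the zero and span calibration points.
    return (counts - ZERO_COUNTS) / (SPAN_COUNTS - ZERO_COUNTS) * SPAN_WEIGHT_KG

print(round(counts_to_weight(40000), 1))   # 64.7 with this made-up calibration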
You get a digital scale, and if you don't like what it says you weigh, move it two inches to either side. You'll get a different reading.
That's a cheap scale, and not a good example. During testing and calibration, weights are placed on the corners to test the cell. So what you have is a cheap bathroom scale.
I like my digital scale more than my analog scale. I am always 15 lbs lighter on my digital scale. It may not be working properly, but since we have a president who lives in an alternate reality, I choose to believe my digital scale, and I accuse my analog scale of being part of the fake scale media.
Lol I can totally understand!
Analogue and digital are simply two different display systems. The accuracy, precision, and repeatability of the instrument depend mostly (not 100%) on the transducer that converts the measured change into a signal. Transducers that measure physical characteristics such as weight, pressure, velocity, length, and temperature are almost invariably analogue devices: pitot tubes, manometers, Bourdon tubes, thermocouples, strain gauges, &c. I cannot think of an exception, at least not in my world of fluid dynamics.
I don't see how there could be an exception. We take an analogue signal and convert it into digital anyway.
@PondartIncbendog Nor do I, but I leave the door open in case someone knows something that I don't.
@Arouet I've been working on antique scales, and I was a registered scale tech for twenty years. I'm also an electronics tech. Retired.
Weight scale? A digital scale always has rounding errors, based on how many decimal places it displays. An analog scale can be read to infinitely many decimal places, but the human eye can't read it that precisely, and analog scales are rarely calibrated well. So... choose your poison.
Rounding errors? It has programmable rounding factors if set up properly.
@PondartIncbendog No, he knows what he is talking about. This has nothing to do with how scales work and everything to do with how the electronics work. An analog scale moves along a smooth, continuous line, whereas a digital scale counts through a series of steps, jumping to the next number when the voltage gets high enough to cross a threshold, because that is what digital means: everything is reduced to a discrete number.
For example, if you have pi grams of pie and you place it on a digital scale with perfect accuracy that reads to the hundredths place, it will only weigh 3.14 grams.
If you place it on an analog scale with perfect accuracy, it will weigh pi grams, but the scale may only be readable to the hundredths place, in which case the scale reads 3.14 grams anyway.
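To see the same thing in code, here's a quick sketch, assuming a readability of 0.01 g:

import math

READABILITY = 0.01   # assume a scale readable to hundredths of a gram

true_weight = math.pi                                  # pi grams of pie
displayed = round(true_weight / READABILITY) * READABILITY
print(f"{displayed:.2f} g")                            # 3.14 g; the rest of pi is gone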
It worries me that kids today think that an analog clock is some mythical device that only the ancients can understand... I remember someone younger telling me that 120 seconds was longer than 2 minutes when they microwaved something. Oh please...
The question is too generic. It would depend on the built in accuracy of the individual instrument.
Exactly. You can't compare a scientific digital scale to a twenty-dollar bathroom scale.
I have been in the recording biz off and on since the late '70s, and, well, I still do it.
To be honest, both have their merits, as well as pitfalls.
The art of digital has advanced by leaps and bounds since the early '80s: sample rates, frequency response, and many other bits (pardon the pun). Digital recording has made the art the most easily accessible format ever. The problem is that with digital, the only limit is the programming, and so many people overuse tools like compression.
I do find it funny when young "audiophiles," for lack of a better word, spend 3k on a turntable thinking that it gives the "truest" sound. In reality, vinyl has some major issues as well.
To be succinct: I love the digital tools, but I miss the art of real-time flaws that really make a recording human. If that makes any sense.
I don’t think OP was talking about musical scales.
That depends on a lot of things. Assuming they have both been calibrated and are placed in identical, ideal conditions, what's going to determine this is the way the scales work.
So, an analog scale will move a lever a certain distance based on the weight, offset either by a spring or by the mechanical tension in the lever. This means the scale will be most accurate in its mid range, and less accurate near its maximum and minimum weights.
For the digital scale, the inaccuracy is going to be a function of the fact that computers can't represent every decimal fraction exactly, because of how binary works. To quickly explain: each binary digit is worth double the one to its right, the same way that in Arabic numerals each digit is worth ten times the one to its right.
0 = 0, 1 = 1, 10 = 2, 11 = 3, 100 = 4, 101 = 5, 110 = 6, 111 = 7, 1000 = 8, 1001 = 9, 1010 = 10.
Binary fractions are stored in the computer as x/2 + y/4 + z/8 ..., and many decimal fractions, like 0.2, cannot be written exactly that way. So if you do a simple floating-point math problem, 1.2 - 1.0, you get 0.19999999999999996.
This problem is baked into the way modern machines handle floating-point arithmetic, so this inaccuracy will be fundamental to any digital scale that does its math in floating point.
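You can check this yourself in Python. Whether any given scale's firmware actually uses binary floats is an assumption, but the arithmetic quirk is real:

print(1.2 - 1.0)   # prints 0.19999999999999996, not 0.2

# 0.2 has no exact binary representation, the same way 1/3 has no exact
# decimal one. Decimal arithmetic sidesteps it:
from decimal import Decimal
print(Decimal("1.2") - Decimal("1.0"))   # prints 0.2 exactly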
All that being said, it's going to come down to calibration anyway, and either could be more accurate than the other. I'm not going to recommend you buy calibrated weights to find out, but that's your choice I suppose.
I think even a 16-bit floating point has more than enough precision for it not to be a concern. You're also assuming that floating points are even being used.
@indirect76 Well, this makes me want to buy a bunch of scales and tear them apart to see what's inside. If it's 16-bit and precise to the thousandths, it should definitely not be an issue. If it's only 8-bit, then a scale measuring in grams should have a maximum weight of 10 kg before it loses precision. Above that weight it would only be precise to the hundredths.
I suppose the scale could be set up to store the weight as an integer count of the minimum readable value (0.001 = 1, 0.010 = 10, 0.011 = 11). That would get around the issue with floating points.
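Something like this toy sketch, assuming 0.001 g is the minimum readable value:

COUNT_GRAMS = 0.001   # assume 1 count = 0.001 g, the minimum readable value

a = 1200   # 1.200 g stored as the integer 1200
b = 1000   # 1.000 g stored as the integer 1000

diff = a - b                            # integer subtraction is exact: 200 counts
print(f"{diff * COUNT_GRAMS:.3f} g")    # 0.200 g, no floating-point residue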
This also ignores another problem I didn't think of: digital scales measure in steps, meaning that their precision is half their readability. A weight on a scale that reads in whole grams which is greater than 1.5 g but less than 2.5 g will read as 2 g. This means there is going to be a fundamental error in the way it counts (for much the same reason floating points give bad math).
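A quick sketch of that stepping behavior, assuming 1 g readability:

READABILITY_G = 1.0   # assume a scale that reads in whole grams

for true_weight in (1.4, 1.5, 2.0, 2.4, 2.6):
    displayed = round(true_weight / READABILITY_G) * READABILITY_G
    print(f"true {true_weight} g -> reads {displayed:.0f} g")
# everything from 1.5 g up to (but not including) 2.5 g reads as 2 g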
[concretecountertopinstitute.com]
@Happy_Killbot Ah, nothing you stated is correct. I have been restoring analogue scales for twenty years. I was also a certified digital scale tech for twenty-two years.
There are springs and levers in analogue spring scales. Springs stretch. Levers get gummed up and wear. On a digital scale, all those "errors" can be programmed out.
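As a rough illustration of what "programmed out" could mean, here is a hypothetical correction table in Python; all the values are invented:

# Hypothetical sketch of programming out a known error: a linearity
# correction table applied to the raw reading. Values are made up.
CORRECTION_POINTS = [(0.0, 0.00), (50.0, -0.12), (100.0, 0.08)]  # (raw kg, offset kg)

def corrected(raw_kg):
    # Interpolate a correction offset between the calibration points.
    pts = CORRECTION_POINTS
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= raw_kg <= x1:
            offset = y0 + (y1 - y0) * (raw_kg - x0) / (x1 - x0)
            return raw_kg + offset
    return raw_kg   # outside the calibrated range, leave the reading alone

print(f"{corrected(75.0):.2f} kg")   # 74.98 kg with this made-up table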
If you think digital scales are inaccurate, why does the whole world and the scientific world use them?
@PondartIncbendog I would point to the fact that you had a job restoring scales as evidence that all scales drift and can lose accuracy over time. Even in the scientific world, scales need to be calibrated regularly to maintain accuracy.
All I'm saying is that calibration is more important in determining accuracy than type of scale, and all scales have some fundamental inaccuracies due to the way they work. That's why when you calibrate things, there is a tolerance associated with it.
@Happy_Killbot I did NOT have a job "restoring" scales. I was a certified and bonded digital scale tech.
More opinions without merit.
@PondartIncbendog Above you said: "I have been restoring analogue scales for twenty years" You are the one who said that you restored scales, not me.
I used to be in the military, and one of my collateral duties was to calibrate gauges, thermometers, transducers, and pressure switches. Some were mechanical, some were digital. If some of this equipment breaks, we have to know how to fix it. A transducer is basically the same thing as a load cell: the same function, but slightly different construction. The voltage output by the transducer is an analog signal, which is then converted to a digital signal before being displayed on the digital readout. Because a digital signal only consists of specific discrete values, there is a limit to the accuracy of the device.
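As a rough sketch of what that conversion step costs you, with an assumed 10-bit ADC over an assumed 0 to 5 volt range:

BITS = 10          # assumed ADC resolution
V_RANGE = 5.0      # assumed input range, volts
step = V_RANGE / (2 ** BITS)          # ~4.88 mV per step

analog_in = 3.21234                   # transducer output in volts (made up)
digitized = round(analog_in / step) * step
print(f"step = {step * 1000:.2f} mV, digitized = {digitized:.5f} V")
# the true 3.21234 V becomes 3.21289 V, the nearest representable step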
Also, just for the record, I never said digital scales are more or less accurate than analog scales; obviously that's going to vary from scale to scale. I'm sure we can agree on that.
It depends on the quality of the device. Cheap ones, on either front, aren't good.
No such thing as a high-quality analog scale.