Tag Archive for calibration

Dummy Load revisited

The dummy load returns and I do some cooking

I very much enjoy watching YouTube channels on electronics. They have probably been the single most instructive resource for me, leading to a real understanding of what I am actually doing.

Some time ago, I watched Dave Jones's excellent episode on building a constant current sink – a "dummy load", which is an essential piece of test gear for checking out power supplies. It's so much less fiddly than messing with loosely spec'd power resistors. I had always intended to build one but never quite got around to it.

Read more

The Milliohm Meter gets a tune up

Inexactness has always bothered me. In the electronics hobby it is to be expected, given the millions of tiny variables we assume don't exist just to make our calculations take less than a year and our designs reasonably constructible at home.

I'm referring specifically to my milliohm meter adaptor, which I've been fiddling with (see here and here, oh, and here). Last time, I cursed the inexactness of my cheap Chinese multimeter, and its lack of a 2A range that would let me be reasonably sure of getting the output as close as possible to 1A. In addition, the accuracy (as contrasted with the resolution) was bad enough to introduce a significant grey area into my calibration.

I have a bunch of 1% resistors lying about that I bought to act as current sense resistors for various projects, and I went about testing them. Given the schematic from last time, the readings tended to be off (and high) by about 2.5%. For the 1Ω 1% resistors this netted a reading of 1.022-1.025 on my multimeter. In the ballpark, but most certainly out of spec. Having ten of these resistors, all out by the same amount, it's easy to conclude that my current source is out of calibration, rather than the resistors being out of spec.
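As a quick sanity check on that figure, the percent error works out as follows – a minimal Python sketch using the readings quoted above:

```python
# Percent deviation of a measured value from its nominal value.
def percent_error(measured, nominal):
    return (measured - nominal) / nominal * 100.0

# Readings of 1.022-1.025 (volts, i.e. ohms) against a nominal 1.000 ohm:
for reading in (1.022, 1.025):
    print(f"{reading:.3f} ohm -> {percent_error(reading, 1.000):+.1f}%")
```

All ten resistors landing in that same +2.2% to +2.5% band is what points the finger at the current source rather than the resistors.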

Adopting a standard

Lacking any better equipment, I decided to take these 1Ω 1% resistors as my standard and set about fixing my current source so they read within 1% of 1Ω and call it a day (as that is as good as I am going to get without better test gear).

After a bunch of fiddling, I decided to remove the 15Ω resistor from the parallel arrangement to drop the current output slightly, by about 50mA. I then adjusted the pot until one of the 1Ω resistors read exactly 1.000V (i.e. 1.000Ω). This puts it (theoretically) within 1% of reality.

I then tested the other nine resistors and trimmed the pot slightly so the distribution of values all hovered within 1% of 1Ω. This means a reading between 0.990 and 1.010V on my multimeter. I found I was able to do this with repeatable results. Job done, right?

Always some fly in the lube

Though I was able to get it to repeatedly read these ten resistors within 1% of their stated values, I did notice something different when I checked a bunch of 0.1Ω 1% resistors I also had lying about. Their readings tended to be high by as much as 30%! So, for a resistor that should show up as 0.100V on my multimeter (well, okay, between 0.099 and 0.101V) I was getting values around 0.120 to 0.130V. Frustrating!

I checked the current output and it was (in)exactly the same as when I was testing the 1Ω resistors, so it doesn't appear as if the current regulation of the LM317 is going into crazy non-linearity. I'm a bit stumped by this one. How could it get 1Ω so right and 0.1Ω so wrong?

I did pick up the 0.1Ω resistors from a junk place, so I'm not at all confident they are what they should be, though that still seems unlikely. Other culprits could be that the thing is heating up, affecting its performance, or that the construction leaves it down in the "noise floor" and subject to all sorts of random perturbations.


Given that I do not yet own my much-coveted Fluke 87-V, which does have the accuracy and resolution for me to tune the current output, I really can't go any further with this one. I could try feeding it 100mA instead of 1A and see if the results are more linear (indeed, this would have been in line with the original specs of the schematic I adapted this from) and, if they are, replace the set resistors with the appropriate values for 100mA. This does give me the annoying problem that the volts display on my multimeter would then read in 10s of ohms, which I had hoped to avoid. I could then, conceivably, add an op-amp with a gain of 10 to amplify the result, which would doubtless introduce more error into the design.
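The scaling trade-off in that 100mA option is easy to sketch. This is a back-of-envelope illustration only; the gain-of-10 stage is hypothetical, not something I've built:

```python
# At a 100 mA test current, the voltage across the resistor under test is
# one tenth of what the 1 A design gives (V = I * R). A hypothetical op-amp
# stage with a gain of 10 would restore the direct volts-as-ohms readout.
TEST_CURRENT_A = 0.1
OPAMP_GAIN = 10.0

def raw_voltage(resistance_ohm):
    """Voltage actually dropped across the resistor under test."""
    return TEST_CURRENT_A * resistance_ohm

def amplified_voltage(resistance_ohm):
    """What the meter would see after the hypothetical x10 stage."""
    return raw_voltage(resistance_ohm) * OPAMP_GAIN

print(raw_voltage(1.0))        # 0.1 V raw: the display now reads in 10s of ohms
print(amplified_voltage(1.0))  # back to the 1 V per ohm mapping
```

Of course, every stage added is another source of error, which is the rub.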

I will also have to check this heating issue and see if the drift is due to the LM317 overheating. An easy fix for that would be to give it a 5V supply (instead of 12+V) to drop the voltage differential the LM317 has to deal with and lower its power dissipation. I may try that now, actually.

Nope – just tried it, and got the same result. I fear this will have to be "good enough" for now. Testing a couple of 0.75Ω resistors looked okay, 1Ω is okay, 0.1Ω is quite a bit off. I might try the 100mA option, but otherwise I'll just finish assembly and call it a day. I wish I had some other 0.1Ω resistors to check…

A word of warning

My design (perhaps foolishly) dumps a constant 1A into anything you connect to it. The test resistors I used were all 2W+ power resistors, and this circuit will definitely (proven by experiment) melt 1/4W carbon film resistors, so please do not do that. Think of the math: a 1Ω resistor with one amp running through it will drop 1V. 1V at 1A = 1W, or 4x the rating of the little 1/4W resistor. So do not use this on something that can't take it. As always, I am not responsible if you set yourself on fire. The 100mA option is looking more attractive by the minute…
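That power math, as a trivial Python check:

```python
# Power dissipated in a resistor carrying a constant current: P = I^2 * R.
def power_dissipated_w(current_a, resistance_ohm):
    return current_a ** 2 * resistance_ohm

print(f"{power_dissipated_w(1.0, 1.0):.2f} W")  # 1.00 W -- 4x a 1/4 W rating
print(f"{power_dissipated_w(0.1, 1.0):.2f} W")  # 0.01 W at 100 mA: far safer
```

The square in P = I²R is exactly why dropping to 100mA is so attractive: a tenth of the current means a hundredth of the dissipation.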

An update

Because I simply love beating dead horses, and really do not like nagging problems, I chose to investigate further. I modified the circuit to produce an output of 100mA or thereabouts thinking that perhaps asking the LM317 to dump an amp might be skewing the results a tad. This also allows me to use the higher resolution 200mA DC current range on my multimeter for a clearer picture. I bypassed the parallel resistor array and whacked in a 15Ω resistor. The results were… interesting:

Resistor   Current   Voltage measurement
1Ω         83.3mA    86.6mV
0.1Ω       83.3mA    11.9mV

Note something interesting? Well, I notice a few things. According to Ohm's law, 100mA into 1Ω should net 100mV. Likewise, 100mA into 0.1Ω should net 10mV. Since the actual current delivered is 83.3mA, the readings should be 83.3mV and 8.33mV respectively. The 1Ω resistor is pretty close at an actual reading of 86.6mV, off by 3.3mV or 3.96%. The 0.1Ω resistor, however, is reading 11.9mV when it should (ideally) read 8.33mV, a difference of 3.57mV or 42.9%! Even accounting for nominal irregularities, that's a huge difference in accuracy.
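For the curious, those error figures fall straight out of Ohm's law – a small Python check using the numbers from the table:

```python
# Expected voltage drop and percent error at the measured 83.3 mA test current.
CURRENT_A = 0.0833

def expected_mv(nominal_ohm, current_a=CURRENT_A):
    """Ohm's law, V = I * R, expressed in millivolts."""
    return current_a * nominal_ohm * 1000.0

def error_pct(measured_mv, nominal_ohm):
    """Percent deviation of the actual reading from the Ohm's-law prediction."""
    exp = expected_mv(nominal_ohm)
    return (measured_mv - exp) / exp * 100.0

# The readings from the table above:
print(f"1 ohm:   expected {expected_mv(1.0):.1f} mV, read 86.6 mV "
      f"({error_pct(86.6, 1.0):+.1f}%)")
print(f"0.1 ohm: expected {expected_mv(0.1):.2f} mV, read 11.9 mV "
      f"({error_pct(11.9, 0.1):+.1f}%)")
```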

What's most interesting about this result is that the measured current through the resistor in both cases was the same, making the variance in results even more surprising.

Now let's go back to the 1A current output, and this time I'll trim it as exactly as I can to 1A.

Resistor   Current   Voltage measurement
1Ω         1.00A     1.066V
0.1Ω       1.00A     0.155V

Now this is very telling. Again, using Ohm's law, 1A should equal 1V in this case (making our math much less messy than before). The actual current reading in both cases is 1.00A (a limitation of the 10A range on my multimeter). The 1Ω resistor again acquitted itself well, being off by 6.6mV or 6.6%. The 0.1Ω resistor, however, is having serious trouble, reading 0.155V, or 55% over its expected value! What on earth is wrong here?

It's obvious that part of the trouble in both cases is that the results are shifted in a positive direction. This could easily be attributable to losses and random craziness in the project design itself and the foibles of the components that make it up. Fortunately, we can trim that out. If we take our 1Ω 1% resistor as a standard (and why not, the math shows it to at least be close to its stated value), we can trim the current so it reads exactly 1.000V and compare. Here are the results:

Resistor   Current   Voltage measurement
1Ω         0.94A     1.000V
0.1Ω       0.94A     0.143V

Now this is much better, and you can see why I used the 1Ω for calibration. The loss of 60mA (to God knows where) is sort of incidental. The important thing is that each resistor is getting identical (or within half a bee's dick of identical) current. The 1Ω resistor shows 1.000V, exactly as it should. Look, however, at the 0.1Ω resistor. It should be reading 0.100V (or within 1% of that) now that we have trimmed out any errors in the system. Instead, it's showing 0.143V, or 43% off of what it should be.

Why exactly this is happening I am unable to determine. I do not have other 0.1Ω resistors to test, but it could very well mean that the batch of 0.1Ω 1% resistors I received from the junk shop is horribly off tolerance. Or it could mean that my multimeter does not read well down into the mV range (even though it has a stated accuracy of 0.5%). Since the LM317 seems to be dutifully dumping the exact same amount of current (be it 100mA or 1A), it would follow that the results should be reasonable regardless of which resistor I used.


It will have to be good enough for now, since I lack the equipment to test further. Either the 0.1Ω resistors are way off, or my multimeter is. Either is plausible, but it's hard to say which. The easiest test would be to grab some other 0.1Ω 1% resistors that inspire more confidence. The best test would be to use a good, properly calibrated multimeter (when I can afford one). The stated 0.5% accuracy of my current heap-of-shit meter would allow the readings to be off by a millivolt or so, but not 43mV! This is of course assuming it was ever calibrated, of which I am doubtful.

Though exactitude is again proving elusive, I can at least measure low-value resistors more accurately, at least in the range of about 0.7Ω-10Ω, which is a bonus compared to what I had before.

Multimeter accuracy and calibration confusion

As a bit of a follow-up to my milliohm meter project, I'm taking a step back to assess the standard by which I measure things. Having been schooled with a science background (chemistry, biology, physics, mathematics), the importance of good data, good results and good science is deeply ingrained in me. I believe this is important in every walk of life, as an aid to critical thinking and a way to debunk the media's annoying tendency to throw meaningless statistics and skewed numbers at us to convince us of whatever they want to convince us of. The tin-foil hat will, however, remain off tonight. I've had three beers, after all.

Like most hobbyists, I accept my multimeter not only as the gold standard for everything I do with electronics, but also as my eyes into what those pesky electrons are actually doing in there. Without it, the study of electronics would be horribly boring. We'd see lumps of circuitry that either did what they were supposed to, or failed in a puff of acrid-smelling smoke – the reek of overload.

Recap of the Problem at Hand

In my milliohm meter project, I had reached an impasse. I created the thing to enable my multimeter to measure low resistances (<10Ω) down into the 100s of micro-ohms range, since pretty much all DMMs without a dedicated function for this fail badly below 10Ω, and especially below 1Ω. To make it work, I need a constant current source. I chose 1A as this made everything line up nicely. Ohm and his law state that 1V = 1A x 1Ω. My DMM, with this box in between, would simply read mV as mΩ. I put it together in my usual way of cobbling schematics, with lots of fussing and reading.
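That 1A choice is the whole trick: at one amp, the voltage the DMM shows maps one-to-one onto resistance. A trivial sketch:

```python
# With a constant 1 A test current, Ohm's law (V = I * R) makes the DMM's
# millivolt reading numerically equal to the resistance in milliohms.
TEST_CURRENT_A = 1.0

def implied_resistance_ohm(measured_voltage_v, current_a=TEST_CURRENT_A):
    """Resistance implied by the voltage the DMM displays."""
    return measured_voltage_v / current_a

print(implied_resistance_ohm(0.001))  # 1 mV reads as 0.001 ohm (1 milliohm)
print(implied_resistance_ohm(0.100))  # 100 mV reads as 0.1 ohm
```

The whole scheme therefore stands or falls on how close that current really is to 1A, which is where the calibration trouble begins.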

It worked, after a fashion. It gave me a reading reasonably close for anything I measured with it. Perhaps a bit higher than the stated value and tolerance would suggest, but close, and certainly far better than my multimeter could do on its own. It seems to be about 2.2% out from where it should be. This could be any number of things, or a combination of things. The set resistor that enables the LM317 to act as a constant 1A current source is actually a bunch of parallel resistors to dial into that sweet spot. CircuitLab told me this should be 1.155Ω; the datasheet for the LM317 told me it should be 1.25Ω. The actual measured value I got was approximately a 1.245V drop across the parallel arrangement, which is close to where it should be, or where I think it should be. I used standard 5% carbon film resistors for this parallel arrangement, with the addition of the critical 100Ω trimpot to calibrate away any oddities in that 5% tolerance.
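The set-resistor relationship comes straight from the LM317 datasheet (I = Vref / Rset, with Vref nominally 1.25V). In this sketch the parallel combination values are purely illustrative, not my actual arrangement:

```python
# LM317 as a constant current source: the regulator holds its ~1.25 V
# reference across the set resistor, so the output current is I = Vref / Rset.
V_REF = 1.25  # volts, nominal per the LM317 datasheet

def set_resistance_for(current_a, v_ref=V_REF):
    """Set resistance needed for a target output current."""
    return v_ref / current_a

def parallel(*resistors):
    """Equivalent resistance of resistors wired in parallel."""
    return 1.0 / sum(1.0 / r for r in resistors)

print(set_resistance_for(1.0))  # 1.25 ohm needed for a 1 A source
# An illustrative (made-up) parallel combination landing near that value:
print(round(parallel(2.7, 2.7, 15.0), 3))
```

The trimpot then exists to soak up the gap between whatever the parallel combination actually measures and that ideal 1.25Ω.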

This is dandy – just build it up, trim it up and the thing works, right? Well, sort of.

The datasheet says it should be 1.25Ω, which for 1A means a voltage drop of 1.25V. 50mV off isn't bad – it's a 5% error – but how can I be sure that will net me 1A out of the thing? The LM317 has its limits too, based on a variety of factors, and those will need to be trimmed out in addition to the 5% resistor tolerance. Then there are the other things: losses in the protoboard I'm using, losses in the leads, stray capacitance, quantum fluctuations – it never ends.

The only thing I needed to be sure of is that the thing is outputting 1A as close to exactly bang on as possible so that I could get an accurate reading. I needed to calibrate to that.

Unfortunately, as previously chronicled, my multimeter's current ranges are limited to 10A, 200mA, 20mA and 2mA. For 1A I am forced to use the 10A range, which gives me a reading of 01.00A. Spot on, yes, but lacking that last digit to make sure it's within the tolerance I need. I need to read 1.000A at least – 1.0000A would be even better! Given that my reading on a 1Ω 1% resistor was 1.022Ω, that makes it 2.2% off-tolerance, and I'm pretty sure it's not the resistor.

Electronics’ Dirty Secret

One of the first things I noticed about electronics when I began playing with it as a child is that the numbers never quite add up in reality. Every time I look at my multimeter when I take a measurement, I shrug and say "close enough", and this can't be helped. It's frustrating when one's math on paper makes nice round exact numbers, yet reality shows us we are just a little bit off. Part of this is due to the fact that we live in the real world, and all the things we normally take for granted as not existing – like the resistance of conductors and PCB traces, or the noise they pick up by acting as antennas – tend to add up and creep into our measurements. Add to that the (in)tolerance of parts and of the meter itself, and you have a mess.

As always, we, the scientists and experimenters, try to minimize this “noise” by buying bigger and better test equipment calibrated by some boffins in lab coats. This is all fine, if you have money. I don’t.

All I have is my Mastech MC8222H, a Chinese-made $70 meter, and that is the most accurate instrument I own. To me it is the de facto gold standard, as I simply have nothing better to compare it to.

The Mastech MC8222H is not a bad meter, especially for its price. It has many annoyances and fluctuates like hell, but it works and has all the features I need for general electronics work on my humble hobbyist bench.

It has a 2000-count 4-digit display, which would make this measurement easy if not for two things. The first: it lacks a 2A range on the current measurement. I can measure 200mA just fine, and 10A just fine, but nothing in between that keeps that third decimal place. That's merely its display resolution, though, which says nothing of its accuracy – the second problem.

A cursory look at its badly translated manual booklet tells me something else I'm not terribly fond of. Though the DCV function has a standard 0.5% accuracy, the DCA on the 10A range has an appalling ±2% + 5 accuracy in the best possible case. For those who don't know, the +5 figure means ±5 digits: the least significant digit could be off by as much as 5 counts, in addition to the ±2% accuracy window.
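Worked out, that spec gives a surprisingly wide error window – a quick sketch, assuming the 10A range's 0.01A resolution:

```python
# Worst-case uncertainty for a DMM spec quoted as "+/- X% of reading + N digits".
def worst_case_error(reading, pct_of_reading, n_digits, resolution):
    """Half-width of the error band: a percentage of the reading plus
    N counts of the least significant digit."""
    return reading * pct_of_reading / 100.0 + n_digits * resolution

# Reading 1.00 A on the 10 A range (0.01 A resolution), spec +/-2% + 5 digits:
err = worst_case_error(1.00, 2.0, 5, 0.01)
print(f"+/-{err:.2f} A")  # +/-0.07 A: the true current could be 0.93-1.07 A
```

A ±7% window around the very current I'm trying to pin to within 1% makes the meter nearly useless for this calibration.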


So here I am, with a bunch of tolerances adding up. The resistors that set the constant current of the LM317 can be out by ±5%, the LM317 itself can be out by a bit, and the meter I'm trying to calibrate against can be off by ±2% and then some – and that's before we even take into account all the micro-anomalies in materials and construction. What is one to do?

The answer, for right this moment, is: nothing. I cannot calibrate this thing any better unless I have one known-good bit of it that I can say is calibrated to within half a bee's dick of its life of where it should be. To me, that means within 0.1%.


One apparent option is to get a 1Ω precision resistor meant for calibration. I did a quick poke about and was unable to find one, but I'm sure they exist. With that, I could dial the current source down until it reads 1Ω and know I'll be getting the best possible measurements from it – not counting the error my multimeter injects just by being itself, of course. At least it would eliminate a couple of error sources straight away.

The other option, of course, is to buy a multimeter worth owning. A brand name, one that is respected – a company that actually calibrates its meters before shipping them out and is known for reliability. The obvious boon here, apart from being calibrated, is much better accuracy in general: we're talking at least 10x better on DCV and at least 2x better on DCA. If I make it a 10,000-count one, I will get my much-coveted missing digit too. The less obvious boons will be a meter I can rely on for twenty years, that won't drift much, and that has features like auto-ranging that will annoy me far less than the Mastech. I'll keep both, of course – you always need at least two meters.

The obvious brand contenders are Fluke, Keysight (Agilent) and BK Precision, with a couple of others like Extech I will consider after those. I haven't set a budget yet, but when I do I imagine it will be about $300.

With this, I can dial that current in to my satisfaction. Ironically, I bet some of these more expensive meters come with a low-Ω function, which completely negates the purpose of this project, but hey – that's why we do these things: to learn. As you can tell, I've learned a lot. Like: don't buy cheap meters.

Voltmeter success!

Now that I’m back to where all my tools are, I had a few minutes to pop down and perform the changes I have been documenting.

After messing about with a few alligator leads, I determined that not only does IN LO need to be connected to analog common, but the voltmeter ground does as well. When IN LO wasn't grounded, I got an initial reading (with nothing connected to the input) way off zero, beyond the point the trimpot could calibrate out.

As the forum posts mentioned in my previous articles stated, the dotted connections must both be made for the thing to work properly. This means REF LO is connected to ANALOG COMMON, which is connected to IN LO and then to GROUND. Given the language in the datasheet, I would have thought it was an either/or scenario, not both. Regardless, I am pleased it's working.

The assembly process was a bit messy; I cleaned up a lot of solder blobs and accidental solder bridges. Unfortunately, through soldering, desoldering and overheating, I lifted a couple of pads off the board. The result works, but it's messy. If it can survive a few knocks and keep working, good enough for now. I can always build another one.

The divider resistor values calculated in my previous post worked a charm. Rather than wasting money and time grabbing 1% resistors, I tried various combinations of 5% ones until I got very close to those values. I tried a number of test voltages from batteries and my soon-to-be-replaced power supply and noted that the reading was linear not only across a range of voltages, but across its scale ranges as well, which is exactly what I was looking for. After calibrating to 100mV and trimming a hair further to get it in line with my multimeter, I am pleased to say it seems accurate to better than 1%, which is not only good enough for its intended purpose but better than I expected.

I have earned myself a beer tonight!