The Milliohm Meter gets a tune-up

Inexactness always bothered me. In the electronics hobby it is to be expected, given the millions of tiny variables we assume don’t exist just to make our calculations take less than a year and our designs reasonably constructible at home.

I’m referring specifically to my milliohm meter adaptor, which I’ve been fiddling with (see here and here, oh, and here). Last time, I cursed the inexactness of my cheap Chinese multimeter and its lack of a 2A range, which would have let me be reasonably sure of getting the output as close as possible to 1A. In addition, the accuracy (as contrasted with the resolution) was bad enough to introduce a significant grey area into my calibration.

I have lying about a bunch of 1% resistors I bought to act as current sense resistors for various projects, and I went about testing them. Given the schematic from last time, the readings tended to be off (and high) by about 2.5%. For the 1Ω 1% resistors this netted a reading of 1.022-1.025 on my multimeter. In the ballpark, but most certainly out of spec. Having ten of these resistors, and finding them all out by the same amount, it’s easy to come to the conclusion that my current source is off calibration rather than the resistors being out of spec.

Adopting a standard

Lacking any better equipment, I decided to take these 1Ω 1% resistors as my standard, set about fixing my current source so they read within 1% of 1Ω, and call it a day (as that is as good as I am going to get without better test gear).

After a bunch of fiddling, I decided to remove the 15Ω resistor from the parallel arrangement to drop the current output slightly, by about 50mA. I then adjusted the pot until one of the 1Ω resistors read exactly 1.000V (i.e. 1.000Ω). This puts it (theoretically) within 1% of reality.
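For anyone following the arithmetic along at home, that tweak hangs off the usual LM317 constant-current rule of thumb: the regulator holds roughly 1.25V across whatever set resistance sits between OUT and ADJ, so the output current is simply that reference divided by the combined set resistance. A quick sketch in Python (the resistor values are placeholders for illustration, not the ones actually on my board, where the trimpot muddies the numbers):

# LM317 constant-current rule of thumb: I_out ≈ V_ref / R_set, V_ref ≈ 1.25V.
V_REF = 1.25  # volts, nominal LM317 reference

def parallel(*branches):
    """Combined resistance of resistors wired in parallel."""
    return 1.0 / sum(1.0 / r for r in branches)

def lm317_current(r_set):
    """Approximate constant-current output for a given set resistance."""
    return V_REF / r_set

# Placeholder values, just to show the trend: pulling a branch out of the
# parallel bank raises the combined set resistance and lowers the current.
print(lm317_current(parallel(1.8, 15.0)))  # ~0.78A with the 15Ω branch in
print(lm317_current(1.8))                  # ~0.69A with the 15Ω branch pulled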

I then tested the other nine resistors and trimmed the pot slightly so the distribution of values all hovered within 1% of 1Ω. This means a reading between 0.990 and 1.010V on my multimeter. I found that I was able to do this and have the results repeatable. Job done, right?

Always some fly in the lube

Though I was able to get it to repeatedly read these ten resistors within 1% of their stated values, I did notice something different when I checked a bunch of 0.1Ω 1% resistors I also had lying about. Their readings tended to be high by as much as 30%! So, for a resistor that should show up as 0.100V on my multimeter (well, okay, between 0.099 and 0.101) I was getting values around 0.120 to 0.130. Frustrating!

I checked the current output and it was (in)exactly the same as when I was testing the 1Ω resistors, so it doesn’t appear as if the current regulation of the LM317 is going into crazy non-linearity. I’m a bit stumped by this one. How could it get 1Ω so right and 0.1Ω so wrong?

I did pick up the 0.1Ω resistors from a junk place, so I’m not at all confident they are what they should be, though it still seems sort of unlikely that they would all be bad. Other culprits could be that the thing is heating up and affecting its performance, or that the construction puts these tiny readings down in the “noise floor” and subject to all sorts of random perturbations.

Acceptance

Given that I do not yet own my much-coveted Fluke 87-V, which does have the accuracy and resolution to let me tune the current output, I really can’t go any further with this one. I could try feeding it 100mA instead of 1A and see if the results are more linear (indeed, this would have been in line with the original specs of the schematic I adapted this from) and, if they are, replace the set resistors with the appropriate values for 100mA. This does give me the annoying problem that the volts display on my multimeter would then read in tens of ohms, which I had hoped to avoid. I could then, conceivably, add in an op-amp with a gain of 10 to amplify the result, which would doubtless introduce more error into the design.
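To make that scaling gripe concrete: at 100mA the meter reading in volts is a tenth of the resistance in ohms, so you either multiply by ten in your head or put a ×10 stage in front of the meter. The gain itself is just the textbook non-inverting op-amp formula; the resistor values below are the usual illustration, not a tested design:

# Display scaling at the two candidate test currents, plus the textbook
# non-inverting op-amp gain that would restore a direct ohms readout.
def reading_volts(r_ohms, i_amps):
    """Voltage the multimeter sees across the resistor under test."""
    return r_ohms * i_amps

def noninverting_gain(r_feedback, r_ground):
    """Gain of a non-inverting op-amp stage: 1 + Rf/Rg."""
    return 1.0 + r_feedback / r_ground

print(reading_volts(1.0, 1.0))      # 1.0V at 1A: the display reads ohms directly
print(reading_volts(1.0, 0.1))      # 0.1V at 100mA: the display reads ohms/10
print(noninverting_gain(9e3, 1e3))  # 10.0, e.g. 9kΩ over 1kΩ (illustrative values)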

I will also have to check this heating issue and see if the drift is due to the LM317 overheating. An easy fix for that would be to give it a 5V supply (instead of 12+V) to drop the voltage differential the LM317 has to deal with and lower its power dissipation. I may try that now, actually.
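While the iron heats up, the back-of-the-envelope for why the supply voltage matters: whatever the supply provides that doesn’t land across the set resistance or the resistor under test gets burned off in the regulator. Round numbers only, assuming roughly 1.25V across the set resistance; the real board will differ a bit:

# Rough LM317 dissipation at 1A: P ≈ (V_supply − V_set − V_dut) × I.
V_REF = 1.25  # volts dropped across the current-set resistance

def lm317_dissipation(v_supply, r_dut, i_out):
    v_dut = r_dut * i_out                   # drop across the resistor under test
    v_regulator = v_supply - V_REF - v_dut  # what's left lands on the LM317
    return v_regulator * i_out              # watts cooked off in the package

print(lm317_dissipation(12.0, 1.0, 1.0))  # ~9.75W from a 12V supply
print(lm317_dissipation(5.0, 1.0, 1.0))   # ~2.75W from a 5V supply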

Nope, just tried it and got the same result. I fear this will have to be “good enough” for now. A couple of 0.75Ω resistors tested okay, 1Ω is okay, but 0.1Ω is quite a bit off. I might try the 100mA option, but otherwise I’ll just finish assembly and call it a day. I wish I had some other 0.1Ω resistors to check…

A word of warning

My design (perhaps foolishly) dumps a constant 1A into anything you connect to it. The test resistors I used were all 2W+ power resistors, and this circuit will definitely (proven by experiment) melt 1/4W carbon film resistors, so please do not do that. Think of the math: a 1Ω resistor with one amp running through it will drop 1V. 1V at 1A = 1W, or 4× the rating of the little 1/4W resistor. So do not use this on something that can’t take it. As always, I am not responsible if you set yourself on fire. The 100mA option is looking more attractive by the minute…
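If you want a sanity check before clipping something onto this, the math above boils down to a couple of lines (swap the 1.0 for 0.1 if you end up building the 100mA version):

# P = I² × R: will the resistor under test survive the test current?
def dut_power(r_ohms, i_amps=1.0):
    return i_amps ** 2 * r_ohms

def survives(r_ohms, rating_watts, i_amps=1.0):
    return dut_power(r_ohms, i_amps) <= rating_watts

print(dut_power(1.0))        # 1.0W dissipated in a 1Ω resistor at 1A
print(survives(1.0, 0.25))   # False: a 1/4W carbon film part will cook
print(survives(1.0, 2.0))    # True: a 2W power resistor is fine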

An update

Because I simply love beating dead horses, and really do not like nagging problems, I chose to investigate further. I modified the circuit to produce an output of 100mA or thereabouts, thinking that perhaps asking the LM317 to dump an amp might be skewing the results a tad. This also allows me to use the higher resolution 200mA DC current range on my multimeter for a clearer picture. I bypassed the parallel resistor array and whacked in a 15Ω resistor. The results were… interesting:

Resistor   Current   Voltage measurement
1Ω         83.3mA    86.6mV
0.1Ω       83.3mA    11.9mV
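Before picking those apart: the 83.3mA figure isn’t random. With the parallel bank bypassed, the LM317’s nominal 1.25V reference sitting across a lone 15Ω set resistor works out to almost exactly that:

# Sanity check on the measured current: LM317 with a single 15Ω set resistor.
V_REF = 1.25  # volts, nominal LM317 reference
R_SET = 15.0  # ohms

print(V_REF / R_SET)  # ~0.0833A, i.e. the 83.3mA on the meter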

Note something interesting? Well, I notice a few things. According to Ohm’s law, 100mA into 1Ω should net 100mV. Likewise, 100mA into 0.1Ω should net 10mV. Since the actual current delivered is 83.3mA, the readings should be 83.3mV and 8.33mV respectively. The 1Ω resistor is pretty close at an actual reading of 86.6mV, off by 3.3mV or 3.96%. The 0.1Ω resistor, however, is reading 11.9mV when it should (ideally) read 8.33mV, a difference of 3.57mV or 42.9%! Even accounting for nominal irregularities, that’s a huge difference in accuracy.
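Here is that arithmetic spelled out, taking the measured 83.3mA as the truth and the nominal resistances as what we hope they are:

# Expected readings from the measured current versus what the meter showed.
I_MEASURED = 0.0833  # amps, as read on the 200mA range

for r_nominal, v_measured in [(1.0, 0.0866), (0.1, 0.0119)]:
    v_expected = I_MEASURED * r_nominal  # Ohm's law: V = I × R
    error_pct = 100.0 * (v_measured - v_expected) / v_expected
    print(f"{r_nominal} ohm: expect {v_expected * 1000:.1f}mV, "
          f"got {v_measured * 1000:.1f}mV, {error_pct:+.1f}% off")
# 1 ohm:   expect 83.3mV, got 86.6mV, +4.0% off
# 0.1 ohm: expect 8.3mV, got 11.9mV, +42.9% off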

What’s most interesting about this result is that the measured current through the resistor was the same in both cases, which makes the variance in results even more surprising.

Now let’s go back to the 1A current output, and this time I’ll trim it to as close to exactly 1A as I can.

Resistor   Current   Voltage measurement
1Ω         1.00A     1.066V
0.1Ω       1.00A     0.155V

Now this is very telling. Again, using Ohm’s law, 1A through 1Ω should equal 1V in this case (making our math much less messy than before). The actual current reading in both cases is 1.00A (a limitation of the 10A range on my multimeter). The 1Ω resistor again acquitted itself well, being off by 66mV or 6.6%. The 0.1Ω resistor, however, is having serious trouble, reading 0.155V, or 55% over its expected value! What on earth is wrong here?

It’s obvious that part of the trouble in both cases is that the results are shifted in a positive direction. This could easily be attributable to losses and random craziness in the project design itself and the foibles of the components that make it up. Fortunately, we can trim that out. If we take our 1Ω 1% resistor as the standard (and why not, the math so far suggests it is at least close to its stated value), we can trim the current so it reads exactly 1.000V and compare. Here are the results:

Resistor   Current   Voltage measurement
1Ω         0.94A     1.000V
0.1Ω       0.94A     0.143V

Now this is much better, and you can see why I used the 1Ω for calibration. The loss of 60mA (to God knows where) is sort of incidental. The important thing is that each resistor is getting identical (or within half a bee’s dick of identical) current. The 1Ω resistor shows 1.000V, exactly as it should. Look, however, at the 0.1Ω resistor. It should be reading 0.100V (or within 1% of that) now that we have trimmed out any errors in the system. Instead, it’s showing 0.143V, or 43% off what it should be.
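Another way to slice it: once the source is trimmed so the 1Ω standard reads 1.000V, the current through the test socket is 1A by definition of that calibration, so the voltage reading is numerically the resistance. Whatever is inside those junk-shop packages is behaving like a rather bigger resistor than the label claims:

# With the 1Ω standard trimmed to read 1.000V, the calibration defines 1A
# through the test socket, so volts read out directly as ohms.
I_CALIBRATED = 1.0  # amps, as defined by the 1Ω = 1.000V trim
v_reading = 0.143   # volts measured across the "0.1Ω" resistor

r_apparent = v_reading / I_CALIBRATED
print(r_apparent)                      # 0.143Ω apparent
print(100 * (r_apparent - 0.1) / 0.1)  # 43% above the claimed 0.100Ω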

Why exactly this is happening I am unable to determine. I do not have other 0.1Ω resistors to test, but it could very well mean that the batch of 0.1Ω 1% resistors I received from the junk shop is horribly off tolerance. Or it could mean that my multimeter does not read well down in the mV range (even though it has a stated accuracy of 0.5%). Since the LM317 seems to be dutifully dumping the exact same amount of current (be it 100mA or 1A), it would follow that the results should be reasonable regardless of which resistor I used.

Conclusions

It has to be good enough for now, since I lack the equipment to test further. Either the 0.1Ω resistors are way off, or my multimeter is. Either is plausible, but it is hard to say which. The easiest test would be to grab some other 0.1Ω 1% resistors that inspire more confidence. The best test would be to use a good, properly calibrated multimeter (when I can afford one). The stated 0.5% accuracy of my current heap of shit meter would allow the readings to be off by a millivolt or so, but not 43mV! This is of course assuming it was ever calibrated, which I doubt.
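To put a number on that: 0.5% of a reading in the 100-150mV neighbourhood is well under a millivolt (real meter specs also tack on a few counts of resolution, but even being generous about that doesn’t get anywhere near 43mV):

# 0.5% of reading, the stated accuracy of the meter.
for reading in (0.100, 0.143):
    print(f"{reading * 1000:.0f}mV reading: allowed error about "
          f"{reading * 0.005 * 1000:.2f}mV")
# 100mV reading: allowed error about 0.50mV
# 143mV reading: allowed error about 0.72mV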

Though exactitude is again proving elusive, I can at least measure low-value resistors more accurately than before, at least in the range of about 0.7Ω to 10Ω, which is a bonus compared to what I had.
