Curve fitting question
Kasper:
I would get more data before curve fitting. If possible / feasible, test at different temperatures, battery levels, or whatever might affect your errors. Also do some tests with Vin increasing and some with it decreasing, if possible. It's hard to imagine an error pattern looking like that every time. If it varies between tests, look for a pattern; if you see one, make an equation or a lookup table. If there is too much difference between tests, then clean up your test setup or your unit under test. I'm sure there are better solutions, but I've found Excel works well for curve fitting. Sometimes I use the polynomials Excel recommends, and other times I make an equation, graph it, and tune the equation until the graph matches the curve I'm trying to fit.
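If you end up doing the fit outside Excel, a least-squares polynomial fit is only a few lines of NumPy. This is just a sketch with made-up placeholder data, not the measurements from this thread:

--- Code: ---
# Least-squares polynomial fit, roughly what Excel's trendline does.
# The Vin / error arrays below are hypothetical placeholders, not the
# measurements from this thread.
import numpy as np

vin = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])              # applied input
err = np.array([0.000, 0.012, 0.021, 0.018, 0.009, 0.000])  # measured error

coeffs = np.polyfit(vin, err, deg=3)   # 3rd-order least-squares fit
fit = np.poly1d(coeffs)                # callable polynomial

# Correct a new reading by subtracting the predicted error
reading = 2.5
corrected = reading - fit(reading)
print(coeffs)
print(corrected)
--- End code ---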
jbb:
If it’s embedded, +1 for lookup table and piecewise linear interpolation. However, that curve is weird. Might I suggest you examine your equipment to check for unexpected conditions? Some colleagues of mine once had a hell of a time with a wonky rotary encoder. It produced very, very strange output. Eventually they realised that the grub screw holding the encoder disk onto the shaft wasn’t done up.
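On the lookup-table-plus-interpolation suggestion: here is a minimal sketch of the logic, with hypothetical breakpoints. On an embedded target the same thing ports to fixed-point C with the table held in flash:

--- Code: ---
# Piecewise linear interpolation against a small calibration lookup table.
# The breakpoints below are hypothetical, not calibration data from this thread.
import bisect

# (raw_reading, corrected_value) pairs, sorted by raw_reading
TABLE = [(0, 0.00), (512, 1.23), (1024, 2.51), (2048, 5.02), (4095, 10.00)]

def correct(raw):
    xs = [p[0] for p in TABLE]
    ys = [p[1] for p in TABLE]
    if raw <= xs[0]:
        return ys[0]                       # clamp below the table
    if raw >= xs[-1]:
        return ys[-1]                      # clamp above the table
    i = bisect.bisect_right(xs, raw)       # first breakpoint above raw
    x0, x1 = xs[i - 1], xs[i]
    y0, y1 = ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)

print(correct(1536))   # halfway between 1024 and 2048 -> about 3.765
--- End code ---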
voltsandjolts:
Data looks strange; if it's not repeatable then it's not even worth trying to fit a curve to it.
rs20:
A Lagrange polynomial is a HORRIBLE idea in this context, because the resulting polynomial will take a circuitous path to travel through every point precisely, no matter how ridiculous the resulting values between the given points are. The ONLY time to use Lagrange polynomials is if you absolutely must have a polynomial curve that goes precisely through all your points, and literally couldn't care less what the rest of the values are. In general I'd say linear interpolation is the best you're going to manage, except:

--- Quote from: voltsandjolts on March 07, 2019, 08:29:21 am ---
Data looks strange; if it's not repeatable then it's not even worth trying to fit a curve to it.
--- End quote ---

+1 to this. The recorded data wildly changes direction from one sample to the next. If the wild changes are due to noise, you shouldn't try to fit to the noise. If the wild changes are a real, actual signal, then you need to make more finely spaced measurements to establish the true shape of the curve (and avoid aliasing).
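To make the Lagrange point concrete: the interpolating polynomial hits every sample exactly, but between samples it is free to swing far outside the data range, while linear interpolation never leaves it. A quick numerical sketch with made-up samples (needs NumPy and SciPy):

--- Code: ---
# Lagrange interpolation vs. linear interpolation between the samples.
# The sample data below is made up for illustration.
import numpy as np
from scipy.interpolate import lagrange

x = np.linspace(0.0, 10.0, 11)
y = np.array([0.0, 0.1, -0.1, 0.2, 0.0, 0.3, -0.2, 0.1, 0.0, 0.2, -0.1])

poly = lagrange(x, y)                  # degree-10 polynomial through all points
xf = np.linspace(0.0, 10.0, 1001)      # dense grid between the samples

print("max |Lagrange| on grid:", np.max(np.abs(poly(xf))))           # typically well above max|y|
print("max |linear|   on grid:", np.max(np.abs(np.interp(xf, x, y))))  # exactly max|y| = 0.3
--- End code ---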
tggzzz:
--- Quote from: Jester on March 06, 2019, 12:34:03 pm ---
I'm working on the calibration aspect of a project and would like to correct for some non-linearity by applying a correction factor. The uncorrected data and graph can be seen here (correction only at zero and full-scale): I tried plugging the data into an online 3rd-order polynomial regression tool and it helps, but is far from ideal. We covered curve fitting in school decades ago and I don't recall much at this stage, except that a polynomial fit is likely a poor choice; perhaps a cubic spline or some other fit method? Also, can you suggest an online tool that will accept preferably 10-15 data pairs? Thanks
--- End quote ---

The "right" curve to use can only be defined after you know the theory of how the measurement is expected to vary. In the absence of that understanding, plotting a curve is like a Rorschach test :)

Examples:
* if you are plotting mortality against time, then you ought to have a bathtub curve :)
* if you are plotting noise, then you shouldn't use a curve, but should use the mean and standard deviation :)
* if you are plotting a time series where there is an expected temperature dependence, then you should use the measured temperature as the explanatory variable
* if you are plotting a warm-up transient, perhaps an offset exponential curve is relevant
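For the last bullet, fitting a theory-driven model such as an offset exponential warm-up can look like this with SciPy's curve_fit; the model parameters and data are illustrative placeholders, not the OP's measurements:

--- Code: ---
# Theory-driven fit: an offset exponential for a warm-up transient,
# y(t) = y_final + A * exp(-t / tau), fitted with SciPy's curve_fit.
# The parameters and synthetic data are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def warmup(t, y_final, amp, tau):
    return y_final + amp * np.exp(-t / tau)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 600.0, 13)                          # seconds since power-on
y = 5.000 + 0.020 * np.exp(-t / 120.0) + rng.normal(0.0, 5e-4, t.size)

popt, pcov = curve_fit(warmup, t, y, p0=(5.0, 0.02, 100.0))
print("y_final=%.4f  amp=%.4f  tau=%.1f s" % tuple(popt))
--- End code ---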