
sound analyzer for automating quality checks?


rhb:
Makes no difference at all.  The short time series smears the spectrum quite a lot.

I'm very pleased and impressed with the progress you've made.  This is pretty standard stuff, but it's not trivial and, as you now know, is a significant amount of work.  Your employer is lucky to have you.

As you are automating what is currently a manual procedure, I'd suggest you start out using a graph of the DUT spectrum with limit lines, or a simple waterfall display on a monitor in portrait mode showing the spectra for each device, along the lines of the sketch below.  Have a human make the send-to-rework decision.  Get some experience with that before implementing SVD-KL or basis pursuit.

My reason for suggesting this is that the graphical displays will make it easy for everyone to understand what you are doing when you automate the last step.
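
Something along these lines, roughly (Python/matplotlib assumed; the limit line and the DUT spectra here are placeholders, not real limits from your test):

--- Code: ---
# Interim display: latest DUT spectrum against a limit line, plus a waterfall
# of recent DUTs, on a portrait monitor.  All data below are placeholders.
import numpy as np
import matplotlib.pyplot as plt

freqs = np.linspace(0, 20e3, 512)                  # Hz, assumed analysis band
limit_db = np.full_like(freqs, -60.0)              # placeholder limit line
limit_db[(freqs > 2e3) & (freqs < 5e3)] = -50.0    # e.g. a looser limit in one band

spectra_db = -70 + 5 * np.random.randn(20, freqs.size)   # fake spectra, 20 DUTs

fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(6, 10))    # portrait layout

# Top: newest DUT against the limit line
ax1.plot(freqs, spectra_db[-1], label='DUT')
ax1.plot(freqs, limit_db, 'r--', label='limit')
ax1.set_xlabel('Frequency (Hz)')
ax1.set_ylabel('Level (dB)')
ax1.legend()

# Bottom: waterfall of recent DUTs so the operator can spot outliers at a glance
ax2.imshow(spectra_db, aspect='auto', origin='lower',
           extent=[freqs[0], freqs[-1], 0, spectra_db.shape[0]])
ax2.set_xlabel('Frequency (Hz)')
ax2.set_ylabel('DUT #')

plt.tight_layout()
plt.show()
--- End code ---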

MasterT:

--- Quote from: rhb on January 16, 2019, 04:25:52 pm ---
--- Quote from: MasterT on January 16, 2019, 03:44:15 pm ---

--- Quote ---We will demonstrate FFT convolution with an example, an algorithm to locate
a predetermined pattern in an image.
--- End quote ---
http://www.dspguide.com/ch24/6.htm


--- End quote ---

I had some concerns when I read that, but a quick skim of the link showed that it was doing things properly.  That is the 1940s Norbert Wiener approach.  It is still perfectly valid and useful, but not as powerful as a sparse L1 pursuit, the reason being that it's L2 (least-squared error).  Until recently L2 was all one could afford computationally, and even that was often a strain on a VAX 11/780.  An L2 solution smears the result, which an L1 solution does not.

However, you can get close to an L1 using reweighted least squares or using singular value decomposition and truncating the eigenspectrum of a Karhunen-Loeve Transform (KLT).  The latter was my tool of choice for problems like this until 2013 when I learned of the work by Donoho and Candes.

What brought their work to my attention was realizing that basis pursuit, following the description in Mallat's 3rd edition, was doing things I *knew*, based on many years of using the SVD-KLT approach, were impossible.  As I had almost 30 years of experience with SVD-KLT, that *really* got my attention.  SVD-KLT is very powerful in good hands.

--- End quote ---
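
(To make the SVD-KL truncation described in the quote concrete, here is a rough sketch, assuming Python/numpy and a synthetic ensemble of traces rather than anything from a real test.)

--- Code: ---
# SVD-KL sketch: stack an ensemble of traces as rows, take the SVD, and keep
# only the strongest components of the eigenspectrum.  Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_traces, n_samples = 32, 256

# Traces that share one underlying waveform, buried in noise
t = np.linspace(0.0, 1.0, n_samples)
common = np.sin(2 * np.pi * 5 * t)
X = np.outer(0.5 + rng.random(n_traces), common)
X += 0.8 * rng.standard_normal((n_traces, n_samples))

# Singular values of the data matrix give the KLT eigenspectrum (up to squaring)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Truncate: keep the k strongest components, discard the rest
k = 1
X_kl = (U[:, :k] * s[:k]) @ Vt[:k, :]

print("energy kept by rank-%d model: %.1f%%"
      % (k, 100 * np.sum(s[:k] ** 2) / np.sum(s ** 2)))
--- End code ---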

After reading this vocabulary, I understand why "Mad Cow" disease outbreaks happened.
Lucky that mathematics was invented long before patent lawyers were born.

rhb:
It's far worse than you can imagine.  The same mathematics appears with a dozen names in the literature.  I've lost count of how many times I've spent several days investigating some "new" algorithm only to find that it was just someone reinventing the wheel.

I go to considerable trouble to try to restrict myself to what appears to be the mainstream lexicon.  But if you span lots of disciplines as I do, it eventually makes you crazy.  What I wrote is actually *very* generic.  But I feel obliged to define things like the L0, L1 and L2 norms (see the short example below) so that non-mathematicians have some idea of what I'm saying.  I hate it when someone uses a dozen acronyms without defining them.
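
For instance, in code rather than notation (numpy assumed; x is just a made-up vector):

--- Code: ---
import numpy as np

x = np.array([0.0, 3.0, 0.0, -4.0, 0.0])

l0 = np.count_nonzero(x)      # "L0 norm": how many entries are nonzero (sparsity)
l1 = np.sum(np.abs(x))        # L1 norm: sum of absolute values -> 7.0
l2 = np.sqrt(np.sum(x ** 2))  # L2 norm: Euclidean length -> 5.0 (least squares lives here)

print(l0, l1, l2)
--- End code ---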

If you think this was bad, look at  "A Mathematical Introduction to Compressive Sensing" by Foucart and Rauhut.  It was written for mathematicians and the jargon is incredibly opaque.  "A Wavelet Tour of Signal Processing" by Mallat is almost as bad.

I'm 65.  My undergraduate degree was in English literature.  My MS was in geology.  So the fact that I spent 3 years reading F&R twice and Mallat once plus about 1500 pages of original papers in mathematics rather boggles my mind.  But it was a lot of fun.  I just wish I could find someone else that I could talk to about it.

MasterT:
I always thought that a man should be proud of what he has done, or what he has invented.
Not of what he has read and is able to "parrot" on a forum where nobody understands what he is talking about.
No offence meant, just a non-native English speaker with post-brainwash traumatic disorder.

rhb:
Learning is the most important accomplishment in life.  I read 3000+ pages over 3 years, and I can explain what it is, why it works and how to do it in simple English.  F&R proved so difficult on first reading that I had to read Mallat and then reread F&R.  F&R is 600 pages; Mallat is 800.  But even after that it was still unclear.  It was not until I read the original papers by Candes, Donoho and their students, and made a foray into the geometry of N-dimensional space, that I reached the point where I understood it.  I worked as a research scientist/programmer in the oil industry with some of the very best people in the field.

Nothing I have done compares with the difficulty of learning what I refer to as  "sparse L1 pursuits".

Applications include:

compressive sensing (single pixel camera and MRI video)
passive radar (locate other airplanes using only the ambient RF field so that you do not reveal your presence)
matrix completion (the Netflix prize solution)
blind source separation (isolate  any speaker at a cocktail party with a few microphones randomly placed)
statistics and machine learning (far too many buzz words to list)
error correction (detect signals below the noise floor)
genetics (identify the alleles in DNA that cause an inherited trait)

There are more, but they involve really exotic mathematics and are primarily of interest to mathematicians rather than engineers and scientists.

My references to Wiener, SVD-KL, etc. were for the sake of tying this to the formal education in mathematics which most engineers receive, albeit without sufficient exercise to fully comprehend it until graduate school.  That is all basic DSP.  The attached figure is why sparse L1 pursuits are so different and important.  The explanation will mean nothing if you do not understand Shannon-Nyquist sampling theory.

The upper-left figure is the Fourier spectrum of an arbitrary waveform.  Below it is the time-domain waveform, with 16 samples randomly selected from the 64 samples.  The plot is drawn with the usual sin(x)/x interpolation between points, as done on a DSO.

At the top right is the result of attempting to recover the Fourier spectrum from the 16 samples using an L2-norm inverse.  The bottom right is the result of doing the same thing, except using an L1 norm.  In both cases Ax = y is being inverted to recover the Fourier coefficients in x.  The FFT solves the same problem via L2 under Shannon-Nyquist sampling constraints.  Shannon still applies in the L1 solution; zeros don't convey information.
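
If you want to play with this yourself, here is roughly what that experiment looks like (Python with numpy/scipy assumed; I use a real DCT basis instead of the complex Fourier basis in the figure to keep the linear program simple, and the sparse spectrum is made up):

--- Code: ---
# Recover a sparse 64-point spectrum from 16 random time samples: minimum-norm
# L2 inverse vs. L1 basis pursuit.  Everything here is synthetic.
import numpy as np
from scipy.fft import idct
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 64, 16

# Sparse spectrum: a few nonzero DCT coefficients
x_true = np.zeros(n)
x_true[[3, 17, 40]] = [1.0, -0.7, 0.5]
signal = idct(x_true, norm='ortho')            # time-domain waveform

# Random sub-sampling: A maps spectrum coefficients to the observed samples
idx = np.sort(rng.choice(n, m, replace=False))
A = idct(np.eye(n), norm='ortho', axis=0)[idx, :]
y = signal[idx]

# L2: minimum-norm solution of the underdetermined system Ax = y
x_l2 = np.linalg.lstsq(A, y, rcond=None)[0]

# L1: basis pursuit, min ||x||_1 subject to Ax = y, as a linear program (x = u - v)
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None), method='highs')
x_l1 = res.x[:n] - res.x[n:]

print("L2 recovery error:", np.linalg.norm(x_l2 - x_true))
print("L1 recovery error:", np.linalg.norm(x_l1 - x_true))
--- End code ---

The minimum-norm L2 answer spreads energy across all 64 coefficients, while the L1 answer typically puts it back on the few that were actually there, which is the smearing difference the figure shows.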

That is the biggest advance in signal processing since Wiener's work in the 1940s.  All DSP that does not have "wavelet" attached to it is based on Wiener's work.  That's most of the DSP I've seen done in 37 years of doing it, mostly in major oil company research and technical services departments.
