What you're making is basically an LCR meter. So you need to be able to generate a signal to send into the DUT, then measure the resulting current and voltage waveforms. From there it's just math, which is fairly well documented, but the whole procedure gets harder to execute as test frequency increases.
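To give a feel for the math side, here's a minimal sketch of the core calculation: quadrature (lock-in style) demodulation of simultaneously sampled voltage and current waveforms to get complex impedance. This assumes an ideal setup (known test frequency, known sample rate, an integer number of cycles in the capture), and all the names and values are just illustrative:

```python
import numpy as np

def complex_impedance(v, i, fs, f_test):
    """Extract Z = V/I (complex) by projecting each waveform onto
    sin/cos references at the test frequency.
    v, i: sampled waveforms; fs: sample rate [Hz]; f_test: test freq [Hz]."""
    n = len(v)
    t = np.arange(n) / fs
    ref_c = np.cos(2 * np.pi * f_test * t)
    ref_s = np.sin(2 * np.pi * f_test * t)
    # Projection gives the complex phasor of each signal; the 2/n factor
    # recovers the amplitude of a unit-amplitude sinusoid.
    V = (v @ ref_c - 1j * (v @ ref_s)) * 2 / n
    I = (i @ ref_c - 1j * (i @ ref_s)) * 2 / n
    return V / I

# Quick self-test against a known DUT: 100 ohm in series with 10 uF at 1 kHz
fs, f_test, n = 1_000_000, 1000.0, 100_000
t = np.arange(n) / fs
z_true = 100 + 1 / (1j * 2 * np.pi * f_test * 10e-6)
i_wave = 0.01 * np.sin(2 * np.pi * f_test * t)          # drive current
v_wave = np.real(-0.01j * z_true * np.exp(1j * 2 * np.pi * f_test * t))
print(complex_impedance(v_wave, i_wave, fs, f_test))    # ~ (100 - 15.9j) ohm
```

That demodulation step is also exactly where the frequency-scaling pain comes from: at higher test frequencies you need more sample-rate headroom and tighter channel-to-channel phase matching to keep the imaginary part honest.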
The actual hardware depends on your requirements. Unless you need synchronized simultaneous sampling, it's going to be much better to use an analog mux than a complete measurement chain on every channel. 10kHz shouldn't be too bad in terms of performance requirements, but a 350kHz-capable LCR meter is generally a pretty expensive thing (even good benchtop ones get really pricey beyond 100kHz). Measuring this accurately depends on fine-grained, simultaneous current and voltage measurements (or measurements correlated to the same phase of the excitation waveform), and fine discrimination of the signal's phase. You could think of the requirements as a signal generator, a high-resolution two-channel scope, and a current probe.
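Once you have complex Z, turning it into an L/C/R reading is a model choice, not a measurement. A minimal series-equivalent conversion, continuing the sketch above (real meters also offer a parallel model, which matters for high-impedance DUTs):

```python
import numpy as np

def series_model(z, f_test):
    """Interpret complex impedance Z as series R + jX at f_test,
    reporting R plus either L (X > 0) or C (X < 0)."""
    w = 2 * np.pi * f_test
    r, x = z.real, z.imag
    if x >= 0:
        return {"R_ohm": r, "L_H": x / w}        # inductive DUT
    return {"R_ohm": r, "C_F": -1 / (w * x)}     # capacitive DUT

print(series_model(100 - 15.915j, 1000.0))  # ~ {'R_ohm': 100, 'C_F': 1e-05}
```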
I'd suggest reading up a bit on impedance and what factors into it as a starting point, then very clearly defining how you need to make these measurements. Do they need to be time-correlated? Do they need to be continuous? How much time can be spent on each measurement if you're switching between multiple inputs? From there it would be a deep dive into LCR meter architecture, and a look at budget solutions and whether they could work for you at all.