Cap discharging into LED to what end? As a strobe?
The trouble with LEDs is, you don't gain much intensity at high current, so it's not worth trying to drive them at more than a few times their ratings. So, you might as well design the circuit for continuous operation, except you can use an oversized filter capacitor that averages over the pulses (with only a modest voltage drop), so the average supply current consumption remains fairly low.
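To put rough numbers on that "modest voltage drop": the capacitor supplies the pulse current and droops by dV = I*t/C, so you size C from there. The pulse current, width, and allowed droop below are illustrative assumptions, not values from the circuit under discussion.

```python
# Sketch: sizing the oversized filter cap so pulses only droop it a little.
# C = I * t / dV (charge drawn during the pulse over allowed voltage sag).
def cap_for_droop(i_pulse_a, t_pulse_s, dv_max_v):
    """Capacitance (farads) so a pulse of i_pulse_a amps lasting
    t_pulse_s seconds sags the rail by no more than dv_max_v volts."""
    return i_pulse_a * t_pulse_s / dv_max_v

# Assumed example: 1 A LED pulse, 2 ms wide, 0.5 V droop allowed.
c = cap_for_droop(1.0, 2e-3, 0.5)
print(f"{c * 1e6:.0f} uF")  # -> 4000 uF
```

A few thousand microfarads is cheap at low voltage, which is the point: the supply only ever sees the (low) average current.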
Put another way: worry more about the LEDs (and other semiconductors) than the resistor(s). You don't usually gain much from using any semiconductor in pulsed operation, because the active part is relatively expensive, so it's already designed to be near or at the limits of useful operation. At high currents, LED efficiency drops off; BJT hFE falls, and Vce(sat) and switching times rise; MOSFETs are limited by Rds(on) and supply voltage; etc.
By and large, the only advantage that you get from pulsed operation is reduced thermal capacity. In other words, you can't save a thin cent on semiconductor ratings, but you can save on heatsinking.
The problem with using LEDs as strobes is, you want to design for a certain maximum pulse width, usually a few milliseconds, so that everything illuminated by the flash appears momentarily frozen. You don't want a long pulse (tens or hundreds of ms), because that lowers the speed at which something blurs rather than appearing frozen -- it breaks the illusion, and makes it less useful if this gets used for more than entertainment (e.g., automotive timing indicator, inspecting rotating machinery). So the pulse width is your hard limit.
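The "frozen" criterion is just motion blur: during the flash, an object smears by its speed times the pulse width. The speeds below are made-up examples to show why milliseconds work and hundreds of milliseconds don't.

```python
# Sketch: how far an object moves (blurs) during one flash pulse.
def blur_mm(speed_m_per_s, pulse_s):
    """Distance (mm) an object travels while the flash is on."""
    return speed_m_per_s * pulse_s * 1000.0

# Assumed example: a point on a machine moving at 10 m/s.
print(blur_mm(10.0, 1e-3))    # 1 ms pulse  -> 10.0 mm smear
print(blur_mm(10.0, 100e-3))  # 100 ms pulse -> 1000.0 mm smear
```

At 1 ms the smear is barely noticeable; at 100 ms the "frozen" image is a meter-long streak.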
To maximize brightness, you need to maximize power during the pulse. If you're limited in how much peak power you can deliver (because going higher isn't making more light, just damaging the LEDs), then all you can do is make the pulse more square.
The reason traditional photoflash methods use capacitive discharge is that they're very well suited to it. Once you turn a xenon tube into plasma, it tends to stay that way. It has no control electrode. So, if you can give it a push, then happily wait for it to run its course as it fucks shit up (typical peak current is in the 100s of amperes -- the sheer force of magnetic fields against wires is part of the 'pop' sound characteristic of a flash circuit!), you don't have to worry about timing and control circuitry; it just keeps going until the cap is empty.
But you don't get the same thing with LEDs; they simply conduct as current is applied. You can discharge a cap into them, all the same, but after the initial peak, it tails off very, very slowly. It's not like a plasma discharge nearly shorting the cap out and gulping down the power. The LEDs never turn off as such, and plenty of energy is left in the capacitor, since it only discharges down to a bit less than Vf.
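Worth quantifying how much energy gets stranded: stored energy goes as 0.5*C*V^2, so the fraction left when the cap settles near Vf is (Vf/V0)^2. The 12 V cap and 9 V LED string below are assumed numbers for illustration.

```python
# Sketch: fraction of capacitor energy left unused after an LED discharge.
# E = 0.5*C*V^2, so the stranded fraction is (Vf/V0)^2, independent of C.
def stranded_fraction(v_initial, v_forward):
    """Fraction of stored energy remaining when the cap has
    discharged from v_initial down to the LED string's Vf."""
    return (v_forward / v_initial) ** 2

# Assumed example: 12 V cap feeding a 3-LED string with Vf ~ 9 V.
print(stranded_fraction(12.0, 9.0))  # -> 0.5625
```

Over half the energy never reaches the LEDs -- a stark contrast with a xenon tube, which drains the cap nearly to zero.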
So, like I said, you might as well use an overly large capacitor, at a nominal voltage, and drive it with a mostly constant current, and implement the timing externally -- heck, a 555 would be fine here.
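For that external timing, the standard 555 monostable formula is t = 1.1*R*C. The R and C values below are one assumed pair that lands near a 2 ms pulse, not a prescribed design.

```python
# Sketch: 555 monostable output pulse width, t = 1.1 * R * C.
def pulse_555(r_ohms, c_farads):
    """Pulse width (seconds) of a 555 timer in monostable mode."""
    return 1.1 * r_ohms * c_farads

# Assumed example: R = 18k, C = 100 nF.
t = pulse_555(18e3, 100e-9)
print(f"{t * 1e3:.2f} ms")  # -> 1.98 ms
```

Any pair of standard values with the same R*C product works; the 555's output can then gate a constant-current driver for the LEDs.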
As an aside...
What's scary about that is, you know how digital cameras have flash settings? Well, it's not like they're adjusting the charge on the capacitor or something; that wouldn't be nearly accurate enough. Slow, too. No, they actually have an IGBT in series with the flash tube. They turn it off after a few microseconds to milliseconds, once the scene is bright enough. For short flashes (under a ms), the intensity should be a pretty flat, square pulse. Using it like it's an LED, but a whole hell of a lot more luminous! It's pretty crazy that they're able to get all that current (100 A or less) through one tiny SMD chip (usually SO-8 size), with about 5 V drop typical.
Tim