Let's do the math to see what minimum battery capacity you actually need. If you can get away with less current (and brightness), the battery capacity required will reduce proportionately. Also, we are going to work in Wh (watt-hours) rather than the more common Ah (ampere-hours), as we need to know the total energy. All calculations are rounded to three significant figures.
Assume the website is accurate and the LEDs each draw 20mA at 3.0V.
10 days * 24 h = 240 h (hours)
20 mA * 3.0 V = 60 mW
60 mW * 240 h * 25 LEDs = 360 Wh
At 3 V, that's:
360 Wh / 3 V = 120 Ah
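If you want to sanity-check the arithmetic, here's a quick Python sketch. The figures are the assumed website values, not measurements:

```python
# Energy needed for 25 LEDs at 20 mA / 3.0 V for 10 days (assumed values).
led_current_a = 0.020          # 20 mA per LED, per the website
led_voltage_v = 3.0            # claimed forward voltage
num_leds = 25
run_time_h = 10 * 24           # 240 hours

power_per_led_w = led_current_a * led_voltage_v            # 0.06 W = 60 mW
total_energy_wh = power_per_led_w * run_time_h * num_leds  # 360 Wh
print(f"Total energy: {total_energy_wh:.0f} Wh")
print(f"At 3 V: {total_energy_wh / 3.0:.0f} Ah")           # 120 Ah
```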
For comparison:
An Energizer E95 alkaline D cell is good for about 2000 mAh. Over its usable life the voltage will drop from 1.5 V to 0.8 V, averaging about 1.2 V, so it can deliver 2 Ah * 1.2 V = 2.4 Wh. That's 0.67% of the energy you need, and I'm not even going to bother calculating how many D cells that is.
Another comparison:
A premium 18650 Li-Ion cell, the Panasonic NCR18650B, is rated at 3400 mAh. Its average voltage over its entire usable discharge range will be about 3.7 V, so it actually stores 3.4 Ah * 3.7 V = 12.6 Wh. That's 3.49% of the energy you need, so the absolute minimum will be 29 18650 cells.
Unfortunately, an LED driver that runs from a bank of Lithium cells won't be 100% efficient (85% isn't unrealistic for a good one), and generic 18650 cells won't have the high capacity of the Panasonic one. Good quality ones are typically in the range 2000 mAh to 2600 mAh; fakes may be much lower. Assuming 2200 mAh cells and 85% (0.85) efficiency in the LED driver, each cell gives you 2.2 Ah * 3.7 V = 8.14 Wh, and 8.14 Wh * 0.85 = 6.92 Wh effective energy. That's 1.92% of the energy required, so you'll need 52 cells.
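Here's the same sort of sketch for the 18650 cell counts, using the assumptions above (3.7 V average cell voltage, 85% driver efficiency for the generic case):

```python
# 18650 cell-count estimates (assumed figures from the text above).
total_energy_wh = 360.0        # from the earlier calculation
avg_cell_voltage_v = 3.7       # average Li-Ion voltage over discharge

# Ideal case: premium 3.4 Ah cells, driver losses ignored
premium_cell_wh = 3.4 * avg_cell_voltage_v                        # ~12.6 Wh
print(f"Premium cells: {total_energy_wh / premium_cell_wh:.1f}")  # ~28.6, so 29 minimum

# Realistic case: generic 2.2 Ah cells through an 85% efficient driver
generic_cell_wh = 2.2 * avg_cell_voltage_v * 0.85                 # ~6.92 Wh effective
print(f"Generic cells: {total_energy_wh / generic_cell_wh:.1f}")  # ~52
```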
A further fly in the ointment is Lithium cells' behaviour at end of discharge: if you discharge them below 3.0 V you will damage them, possibly unrecoverably, and attempting to recharge deeply discharged Lithium cells can be hazardous. If you use too simple a controller, one without over-discharge protection, any delay getting to the cells at the end of the event and you'll probably have to scrap them.
That scrapyard 100 Ah car battery I suggested earlier probably has about 60 Ah capacity left (make sure it load tests OK before purchase). If you don't want to kill it stone dead, you can't use more than 40 Ah of that. The average voltage over the full discharge range of a lead-acid battery is approximately 12.3 V.
40 Ah * 12.3 V = 492 Wh, and 492 Wh * 0.85 = 418 Wh effective energy, which is about 116% of the 360 Wh you need, so there's margin to spare.
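And the same sketch for the car battery option, with the 40 Ah / 12.3 V / 85% assumptions above:

```python
# Lead-acid (scrapyard car battery) estimate, assumed figures from above.
usable_ah = 40.0               # don't pull more than 40 Ah of the ~60 Ah left
avg_voltage_v = 12.3           # average lead-acid voltage over discharge
driver_efficiency = 0.85

effective_wh = usable_ah * avg_voltage_v * driver_efficiency        # ~418 Wh
print(f"Effective energy: {effective_wh:.0f} Wh")
print(f"Fraction of requirement: {100 * effective_wh / 360:.0f}%")  # ~116%
```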
Even if you decide that a measly 2mA is bright enough, you'll need six 18650 cells to be reasonably certain they'll run the LEDs long enough.
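The six-cell figure comes from the same arithmetic scaled down by a factor of ten; a rough sketch with the generic-cell assumptions from above:

```python
import math

# 2 mA case: energy scales with current, same 2.2 Ah / 3.7 V / 85% assumptions.
scaled_energy_wh = 360.0 * (2.0 / 20.0)                 # ~36 Wh
generic_cell_wh = 2.2 * 3.7 * 0.85                      # ~6.92 Wh effective per cell
print(math.ceil(scaled_energy_wh / generic_cell_wh))    # 6 cells
```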
For brightness testing, you could do a lot worse than putting a 100 ohm resistor and a 1 kilohm potentiometer (aka 'pot', wired as a rheostat) in series with ONE LED and powering the resulting assembly from 5 V. You can use a USB charger wall wart for the 5 V if you don't have a variable lab-grade power supply. If the wall wart output is over 5.2 V, add a silicon diode in series to drop the voltage a bit.
Turn the pot to maximum resistance and the LED will get slightly under 2 mA. Turn it to minimum and it will get about 20 mA. To measure the LED current, measure the voltage across the 100 ohm resistor. 1 V is 10 mA, so simply multiply the reading in volts by 10 to get the current in mA. (If it's under a volt and the reading is in mV, divide the number by 100.)
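For reference, here's a sketch of the currents to expect from that test circuit, assuming the LED really does drop about 3.0 V and the supply is a clean 5 V:

```python
# Expected currents in the test circuit (assumed ~3.0 V LED drop, 5.0 V supply).
supply_v = 5.0
led_v = 3.0            # assumed forward voltage; the point of the test is to measure it
r_fixed = 100.0        # fixed 100 ohm resistor
r_pot_max = 1000.0     # 1 kilohm pot wired as a rheostat

i_min_ma = 1000 * (supply_v - led_v) / (r_fixed + r_pot_max)   # pot at max resistance
i_max_ma = 1000 * (supply_v - led_v) / r_fixed                 # pot at minimum
print(f"{i_min_ma:.1f} mA to {i_max_ma:.1f} mA")               # ~1.8 mA to ~20 mA

# Converting a reading across the 100 ohm resistor to LED current:
v_reading = 1.0                                    # example: 1 V measured
print(f"{1000 * v_reading / r_fixed:.0f} mA")      # 10 mA
```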
It's also a good idea to measure the voltage across the LED when you are happy with the brightness.
Please also measure the voltage across the LED as close as you can get to 2 mA (200 mV across the 100 ohm resistor) and 20 mA (2 V across the 100 ohm resistor). If you can't get all the way up to 20 mA and you added a diode earlier, turn the pot back, remove the diode and try again, taking care not to go over 20 mA.
Post the final readings here and we can help you find a cost-effective battery system that will do the job. The 2mA and 20mA LED voltage measurements are to help us design or recommend a suitable driver circuit.