Now for my next unknown: how do I determine the necessary clock speed? I understand the basics, but am I missing anything? I need to determine the maximum theoretical processing load that my program will put on the hardware.
-I have an interrupt to count seconds (the widget has an internal clock and knows the time of day). This should be a VERY low processing load.
-I have an interrupt during music playback which updates the output for each sample (8 kHz rate). This will be by far the largest portion of the processing load.
-I have other sensor inputs which will be read every 15 seconds or so. Negligible processing load.
Considerations:
-As the widget keeps track of the time of day (and we don't want to accumulate drift), timekeeping is the highest priority. Even when playing sound, keeping time takes priority over updating the sound output on time.
-Sensor readings just update current values; they are not logged. So if needed, I will turn off sensor reads while playing sound and resume them afterward with no functional consequence.
I know my clock speed, so I know my instruction rate. What I need to learn is how to count the instructions (and their cycle costs) in a function, so I can work out how long that function takes to run.
How many instructions does it take to enter an interrupt, update the PWM output, and return from the interrupt? I read through my datasheet's section on timer interrupts, but it is a little heavy for my level of experience.