This is a huge topic which cannot be answered in a single post, and which you are highly unlikely to get "right" the first time by yourself. But @NorthGuy has given you the correct starting point IMO.
The embedded product landscape is rife with devices that are locked down tight despite having no real need for security, alongside easily hackable devices with truly frightening potential. From working in the embedded-systems world, my observation is that every company believes its device needs airtight security, without really asking why. As a consequence, the products on the market with good security are simply those built by competent development teams; there is little correlation with actual product needs.
I will now state a contrarian opinion which I think deserves greater consideration in the industry: if this is a consumer product, please remember that your users own it, not you. I see lists of "security best practices" bandied about as if more security were always a good thing. It is not. Although you do have certain responsibilities, ethically and legally, for failures or malicious use of your product after sale, it is not unreasonable for technically inclined users to expect to access (aka "hack") the device for their own purposes. Unless there is legitimate IP to protect (and often that claim is just an excuse to avoid critical thinking), IMO it is unethical to restrict access to the firmware unnecessarily. Additionally, the goodwill generated by permitting an open-source development community to form around your product can be significant; the WiFi router and digital camera markets both offer concrete examples.

OTOH, if you are the "little guy" facing likely competition from cloners in China, that may be reason enough to lock down the firmware... although I am unsure how effective such efforts are in practice.