Stars have a built-in regulatory mechanism that stokes and damps their reactions: the nuclear fusion of hydrogen into helium (and most of the other varieties of heavier fusion found in stars in their senescence) is critically dependent on temperature and pressure -- particularly temperature. For the proton-proton chain that dominates in the Sun, a 10% increase in temperature raises the fusion rate by roughly 40%. For the CNO cycle that dominates in somewhat heavier stars, the same 10% increase raises the fusion rate by almost a factor of six!
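Those sensitivities are just steep power laws in temperature. Here's a rough check (a sketch, assuming the common textbook exponents n ≈ 4 for the proton-proton chain and n ≈ 18 for the CNO cycle -- neither figure comes from this answer):

```python
# Fusion rate scaling as rate ~ T^n, evaluated for a 10% temperature rise.
# The exponents are assumed textbook approximations:
#   n ~ 4  : proton-proton chain (Sun-like cores)
#   n ~ 18 : CNO cycle (somewhat heavier stars)
for name, n in [("pp chain", 4), ("CNO cycle", 18)]:
    print(f"{name}: 10% hotter -> fusion rate x {1.10 ** n:.2f}")
# pp chain: 10% hotter -> fusion rate x 1.46
# CNO cycle: 10% hotter -> fusion rate x 5.56
```

With n ≈ 4 the jump comes out near 46%, in line with the rough 40% quoted above, and n ≈ 18 gives the near-factor-of-six jump.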
Now you might think this is the recipe for a runaway reaction, but in fact an increase in temperature in the core of a star expands the core -- and nuclear reactions take place only in the very center of the star, in the region of peak temperature -- and this expansion lowers the temperature. Things settle down such that fusion occurs, but not too much of it: essentially "just the right amount" occurs to keep the star supported against its own gravity. If the core were more "bottled up", an explosion would indeed result, but the core's ability to expand and to shed energy to the layers above (radiatively in the Sun, convectively in more massive cores) is enough to stabilize the situation.
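Here is a minimal sketch of that thermostat (a toy, not a stellar model). The one physical assumption, borrowed from the virial theorem, is that a bound self-gravitating core has a negative effective heat capacity: dump extra energy in and it expands and cools, so the net energy balance enters with a minus sign:

```python
# Toy gravothermal thermostat: dT/dt = -k * (fusion(T) - losses).
# The minus sign encodes "net heating -> expansion -> cooling".
# All quantities are dimensionless; equilibrium sits at T = 1.
def thermostat(T, n=18, k=1.0, dt=0.01, steps=100):
    for _ in range(steps):
        fusion = T ** n   # steep power-law fusion rate (CNO-like exponent)
        losses = 1.0      # steady energy loss to the layers above
        T -= k * (fusion - losses) * dt
    return T

print(thermostat(1.05))  # -> ~1.00: a 5% overheat is damped away
print(thermostat(0.95))  # -> ~1.00: a 5% underheat warms back up
```

Perturb it in either direction and it relaxes back to the equilibrium where fusion exactly replaces the losses -- "just the right amount."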
In a supernova, the central part of the star runs out of burnable fuel, precipitating a collapse. The outer layers, falling inward, compress inner shells that still contain unburned fuel, igniting fresh nuclear reactions. But here the infalling material "puts the lid" on cooling and expansion, creating exactly the kind of runaway nucleosynthesis and energy release that worried you about normal stars.
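The same toy shows the runaway: flip the sign so that extra fusion energy raises the temperature instead of escaping, as when the infalling material clamps the lid on (again a caricature, not a supernova model):

```python
# "Lid on": expansion and cooling are blocked, so the feedback is now
# positive: dT/dt = +k * (fusion(T) - losses). Same toy units as above.
def lid_on(T, n=18, k=1.0, dt=0.01, steps=100):
    for _ in range(steps):
        T += k * (T ** n - 1.0) * dt
        if T > 10.0:          # clearly diverging; stop before overflow
            break
    return T

print(lid_on(1.05))  # -> enormous: the same 5% overheat now runs away
```

The identical 5% perturbation that the thermostat shrugged off now grows without bound within a handful of steps.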