Thanks for an inspiring post! I had a few thoughts (beyond general appreciation):
Adding a second solar panel to generate more power doesn’t improve the financial “payback” time if the additional power isn’t used or stored. People with grid-tied solar systems and net metering can feed their extra power back into the grid in exchange for a bill credit, but this doesn’t work if you’re off-grid.
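To make that concrete, here’s a rough sketch in Python of how payback only responds to the energy you actually use. All the numbers (panel cost, rate, generation, usage) are made-up examples, not figures from your post:

    # Rough payback-time sketch; every number here is a hypothetical example.
    panel_cost = 300.0          # $ per panel (assumed)
    rate = 0.25                 # $ saved per kWh you don't buy (assumed)
    daily_generation_kwh = 0.5  # per panel (assumed)
    daily_usage_kwh = 0.6       # what you actually consume (assumed)

    def payback_days(n_panels):
        # Only generation you actually use saves money; the rest is wasted
        # off-grid without storage or net metering.
        useful_kwh = min(n_panels * daily_generation_kwh, daily_usage_kwh)
        savings_per_day = useful_kwh * rate
        return n_panels * panel_cost / savings_per_day

    print(payback_days(1))  # 2400 days: all 0.5 kWh/day gets used
    print(payback_days(2))  # 4000 days: only 0.6 of 1.0 kWh/day gets used,
                            # so the payback time gets longer, not shorter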
I think the electrical rate you’re using may be low, assuming you’re on PG&E. On the other hand, if you have no electric heating or cooling and you just run a laptop, phone, and desk lamp, maybe your usage really is low enough to stay in the bottom rate tier.
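As a sanity check on whether that kind of load really stays tiny, here’s a back-of-the-envelope estimate in Python. The wattages and hours per day are my guesses, not measurements:

    # Monthly usage for a small electronics-only load (all values assumed).
    loads = {
        "laptop": (50, 8),   # (watts, hours per day)
        "phone":  (5, 3),
        "lamp":   (10, 5),
    }
    monthly_kwh = sum(w * h for w, h in loads.values()) * 30 / 1000
    print(f"{monthly_kwh:.1f} kWh/month")  # ~14.0 kWh/month

Even with generous assumptions that lands far below typical household usage, so staying in the bottom tier does seem plausible.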
I don’t know enough electrical theory to understand or explain this fully, but there are three different kinds of electrical loads: capacitive, resistive, and inductive. See the first answer at https://electronics.stackexchange.com/questions/91975/what-does-load-mean-and-what-are-the-different-types for a more detailed explanation. Loads with different characteristics will use your power differently. The general consensus seems to be that electric heaters draw far too much power to be practical to run on solar. See http://solarhomestead.com/heating-your-off-grid-home/ for more discussion. If you are generating power at 100 W and your space heater draws 1500 W, it will take about two days of winter sunlight to generate an hour’s worth of heating energy. Or you can size up your generation/storage system 15x just to run a space heater. (See the arithmetic below.)
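The heater arithmetic is easy to check in Python; the 7.5 usable hours of winter sun per day is my assumption:

    # How long to bank one hour of space-heater energy at 100 W generation?
    generation_w = 100
    heater_w = 1500
    hours_needed = heater_w / generation_w           # 15 hours of full output
    winter_sun_hours_per_day = 7.5                   # assumed
    print(hours_needed / winter_sun_hours_per_day)   # -> 2.0 days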
If you are already generating a lot more power than you’re using, you don’t need to care so much about efficiency. But if that ever changes, it would probably make sense to avoid DC to AC to DC conversions (for example, to charge a laptop, or for LED lighting), because each conversion step loses energy (you can tell because the inverter/power brick gets hot; that heat is wasted energy).
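A quick way to see why each step matters is that the overall efficiency is the product of the stages. The per-stage efficiencies below are assumed, typical-ish figures, not measurements:

    # Chained conversion losses: overall efficiency = product of stages.
    from math import prod

    inverter = 0.85      # DC (battery) -> AC, assumed efficiency
    power_brick = 0.85   # AC -> DC (laptop), assumed efficiency

    dc_ac_dc = prod([inverter, power_brick])
    print(f"DC->AC->DC delivers {dc_ac_dc:.0%} of input")  # ~72%

    dc_dc = 0.92         # a single direct DC-DC converter, assumed
    print(f"Direct DC->DC delivers {dc_dc:.0%} of input")  # 92%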
