Ray,
I'm not an EE, but it was my first choice for a major back in college, before I discovered my advisor's EE curriculum involved summer classes and 18-hour semesters. My point is, this could be wrong, so I need someone who is an engineer to take a closer look.
If you amplified the signal on the line, the input power would stay the same. Here's what I came up with just running some quick numbers:
Total Power on a 30A breaker:
P = V * I
110VAC * 30A = 3300W
Resistance on the line (this is the part I'm not sure about, but the formula is right):
P = I^2 * R
R = P/I^2
3300W / (30A)^2 = 3300W / 900A^2 = 3.667 ohms
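Here's that much as a quick Python sanity check, in case it's easier to poke at. The variable names are just mine; the 110VAC and 30A figures are the same assumptions as above:

    # Baseline numbers on the 30A breaker.
    voltage = 110.0           # VAC on the line
    current = 30.0            # A, the breaker limit
    p_in = voltage * current          # P = V * I   -> 3300 W
    r_line = p_in / current ** 2      # R = P / I^2 -> 3.667 ohms
    print(f"P = {p_in:.0f} W, R = {r_line:.3f} ohms")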
Keeping within Ohm's Law, if the current stays the same, resistance has to increase to increase power.
To get 90000W, what does resistance have to increase to?
R = P/I^2
90000W / 900A^2 = 100 ohms
Total voltage:
V = I * R
30A * 100 ohms = 3000V
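And the scaled-up half, same kind of quick check (again, just my own variable names, with the current assumed to stay fixed at 30A):

    # Scaling up: what R and V would give 90000 W at the same 30 A?
    current = 30.0                       # A, assumed to stay fixed
    p_target = 90000.0                   # W, the target
    r_needed = p_target / current ** 2   # R = P / I^2 -> 100 ohms
    v_needed = current * r_needed        # V = I * R   -> 3000 V
    print(f"R = {r_needed:.0f} ohms, V = {v_needed:.0f} V")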
I don't know what that article was talking about with a diode and capacitor configuration, but those numbers look like they make sense to me. What are the pitfalls of this?
Thanks