So what is the functional relationship between input voltage and input current for voltage converters?
It seems self-defeating, as drawing more current from the power lines will normally drop the line voltage even further.
No, it's just basic physics.
Short answer:
Your circuit needs a certain amount of power to run. Let's say it needs 10 watts. That's 2 amps at 5V, 1 amp at 10V, or 0.5 amps at 20V. If the voltage goes down, then the current has to go up to compensate.
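If you want to see the arithmetic spelled out, here's a quick sketch (plain Python, using the example numbers above; this is just I = P / V, nothing more):

```python
# For a fixed power draw, current is just power divided by voltage: I = P / V.
P = 10.0  # watts the circuit needs (example value from above)

for v in (5.0, 10.0, 20.0):
    i = P / v
    print(f"{P:.0f} W at {v:.0f} V -> {i:.2f} A")

# Output:
# 10 W at 5 V -> 2.00 A
# 10 W at 10 V -> 1.00 A
# 10 W at 20 V -> 0.50 A
```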
More detailed answer:
Most electronic devices require a constant-voltage power supply to operate correctly. Depending on the circuit this might be 12V, 5V, 3.3V, 400V, whatever. But fundamentally most circuits need a clean, reliable, constant-voltage input. If they need more power, they draw more current, but the voltage needs to stay the same. That requires a voltage regulator. You'll find voltage regulators in 99% of all electronic equipment that you use.
The simple way to regulate the voltage is to put some sort of resistor in series and adjust it to maintain a constant voltage as conditions change. Usually this "resistor" is actually a transistor, with a feedback circuit to maintain the right voltage. Current in equals current out. Very simple. The problem is that this kind of regulator is very wasteful: if 12V comes in and you only need 6V, you have to burn off half the power in the regulator as heat. That's why these simple linear regulators tend to get toasty hot.
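To put rough numbers on that waste, here's a back-of-the-envelope sketch (hypothetical values; it just applies "current in equals current out" as described above):

```python
# Series (linear) regulator: the same current flows in and out, so every
# volt dropped across the regulator is burned as heat.
def linear_regulator(v_in, v_out, i_load):
    p_out = v_out * i_load                   # power delivered to the load
    p_dissipated = (v_in - v_out) * i_load   # power burned in the pass transistor
    efficiency = p_out / (p_out + p_dissipated)
    return p_dissipated, efficiency

p_heat, eff = linear_regulator(v_in=12.0, v_out=6.0, i_load=1.0)
print(f"heat: {p_heat:.1f} W, efficiency: {eff:.0%}")
# heat: 6.0 W, efficiency: 50%
```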
A better technique is to use a switching regulator, which can convert between two voltage levels without burning lots of energy. To do this it chops the incoming power with a switch (a transistor, of course, but wired up to always switch hard all-on or all-off) and then uses a coil or transformer to "boost" or "buck" that chopped power to the right voltage (some converters can boost, some can buck, some can do both). Because the transistor is always either fully off or fully on, it burns off very little power, so the converter is much more efficient. No switching regulator is 100% efficient, but they are often in the high 90s percent.
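For the "buck" (step-down) case, the idealized steady-state relationship is simple enough to sketch. The duty cycle D is the fraction of time the switch is on, and V_out is roughly D * V_in; the 95% efficiency here is an assumed example figure, not a measurement:

```python
# Idealized buck converter: averaged output voltage is about D * V_in,
# and the input only has to supply the output power plus losses.
def buck(v_in, v_out, i_out, efficiency=0.95):
    duty_cycle = v_out / v_in    # fraction of time the switch is on
    p_out = v_out * i_out
    p_in = p_out / efficiency    # input power must cover the losses too
    i_in = p_in / v_in
    return duty_cycle, i_in

d, i_in = buck(v_in=12.0, v_out=5.0, i_out=2.0)
print(f"duty cycle: {d:.0%}, input current: {i_in:.2f} A")
# duty cycle: 42%, input current: 0.88 A
```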
So say you need an output of 10V at 1 amp. That is 10 watts. Ignoring the few percent of efficiency loss, the converter needs 10 watts in. If the input voltage is 10 volts, it draws 1 amp = 10 watts. If the input voltage is 20V, it draws 0.5 amps = 10 watts. As you can see, as the input voltage goes up, the current goes down, and as the voltage goes down, the current has to go up. It needs that 10 watts. So if your line voltage starts to droop, the converter has no choice but to either draw more current or switch off altogether (which a good design will do automatically if the voltage droops too much - just like the Roadster does).
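Here's what that droop behavior looks like in a sketch (the 6V cutoff is a made-up example threshold standing in for the undervoltage shutoff described above):

```python
# Input current for a converter that needs a fixed 10 W out, as the
# line voltage droops. Below the cutoff, a good design shuts off
# rather than drawing ever more current.
P_NEEDED = 10.0   # watts the load requires
V_CUTOFF = 6.0    # hypothetical undervoltage-lockout threshold

def input_current(v_line):
    if v_line < V_CUTOFF:
        return 0.0  # converter switches off entirely
    return P_NEEDED / v_line

for v in (20.0, 10.0, 8.0, 6.0, 5.0):
    print(f"line at {v:4.1f} V -> draws {input_current(v):.2f} A")
```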