Yes and no.
Most people think about power supplies in terms of volts. But with LEDs it's much easier to talk about current.
LEDs behave like a variable resistance: the more voltage you apply, the lower their effective resistance, so current rises very steeply with voltage. Most power supplies are "voltage" supplies, e.g. a battery. But a battery isn't really a CONSTANT voltage supply; there can be a big difference between full charge and dead.
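To make that concrete, here's a rough sketch using the Shockley diode equation with made-up parameters for a white LED (the saturation current and ideality factor below are assumptions, not from any datasheet). The point is how a 0.4 V change in forward voltage swings the current from a couple of milliamps to a couple of hundred:

```c
#include <stdio.h>
#include <math.h>

/*
 * Rough sketch: Shockley diode equation, I = Is * (exp(V / (n*Vt)) - 1),
 * with assumed parameters chosen to look like a white LED.  Shows how a
 * small change in forward voltage produces a huge change in current,
 * i.e. the effective resistance V/I collapses as voltage rises.
 */
int main(void) {
    const double Is = 1e-18;   /* saturation current (assumed)      */
    const double n  = 3.1;     /* ideality factor (assumed)         */
    const double Vt = 0.02585; /* thermal voltage at roughly 25 C   */

    for (double v = 2.8; v <= 3.21; v += 0.2) {
        double i = Is * (exp(v / (n * Vt)) - 1.0);
        printf("Vf = %.1f V  ->  I = %7.1f mA  (effective R = %6.0f ohm)\n",
               v, i * 1e3, v / i);
    }
    return 0;
}
```

With these numbers it prints roughly 1.5 mA at 2.8 V, 18 mA at 3.0 V, and 220 mA at 3.2 V, which is why you can't just hook an LED straight across a "3 V" supply and hope for the best.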
Typically LEDs are driven from a CURRENT supply, i.e. a circuit designed to deliver a constant current rather than a constant voltage. The cheap ones just use a resistor in series with a voltage source, which wastes power and doesn't compensate for changes in the source voltage (a quick sizing example is sketched below). Good circuits measure the current flowing through the LEDs and adjust their output to compensate, like a closed-loop control system. Low-efficiency circuits do this with a transistor acting as a variable resistor, burning off the excess voltage as heat. The best, most efficient circuits do it with pulse-width modulation: the switching element is either fully on or fully off, so it dissipates very little power itself.
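Here's the resistor approach as a worked example. The supply voltage, LED forward drop, and target current are assumed example values, not from any particular part:

```c
#include <stdio.h>

/*
 * Rough sketch of sizing a series ("ballast") resistor for an LED string.
 * All the numbers below are assumed example values.
 */
int main(void) {
    const double v_supply = 12.0;   /* supply voltage (assumed)        */
    const double v_led    = 3.1;    /* forward drop per LED (assumed)  */
    const int    n_leds   = 3;      /* LEDs in series                  */
    const double i_target = 0.020;  /* 20 mA target current            */

    double v_resistor = v_supply - n_leds * v_led;  /* voltage the resistor must drop */
    double r          = v_resistor / i_target;      /* Ohm's law: R = V / I           */
    double p_resistor = v_resistor * i_target;      /* power wasted as heat           */

    printf("R = %.0f ohm, dissipating %.2f W\n", r, p_resistor);

    /* Nothing here measures the actual current, so if v_supply sags
       (e.g. a battery discharging), the LED current drops with it. */
    return 0;
}
```

With these numbers that's 135 ohms dissipating about 54 mW, and the wasted power grows with the gap between the supply voltage and the LED string voltage.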
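And here's a minimal software analogue of the closed-loop, PWM-based approach, the kind of thing a small microcontroller-based driver might do. The sense-resistor value, current setpoint, and the two hardware functions are hypothetical stand-ins, not any real part's API:

```c
#include <stdint.h>

/* Hypothetical hardware hooks: an ADC reading of the voltage across a
   current-sense resistor, and a PWM output driving the switching element. */
extern uint32_t read_current_sense_mv(void);   /* mV across sense resistor */
extern void     set_pwm_duty(uint32_t duty);   /* duty cycle, 0..1000      */

#define SENSE_RESISTOR_MOHM 100     /* 0.1 ohm sense resistor (assumed)    */
#define TARGET_CURRENT_MA   350     /* constant-current setpoint (assumed) */

/* Call this periodically, e.g. from a timer interrupt. */
void control_loop_step(void) {
    /* Actual LED current, from Ohm's law on the sense resistor:
       I(mA) = V(mV) / R(ohm) = V(mV) * 1000 / R(milliohm). */
    uint32_t current_ma = (read_current_sense_mv() * 1000) / SENSE_RESISTOR_MOHM;

    static int32_t duty = 0;

    /* Simple integrating controller: nudge the duty cycle up when the
       measured current is below the setpoint and down when it's above,
       so the current stays constant even as the supply voltage drifts. */
    if (current_ma < TARGET_CURRENT_MA && duty < 1000) duty++;
    else if (current_ma > TARGET_CURRENT_MA && duty > 0) duty--;

    set_pwm_duty((uint32_t)duty);
}
```

Real drivers usually do this feedback in analog hardware and much faster, but the principle is the same: measure the current, compare to the setpoint, adjust the switch.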
Light output increases with current, but only up to a point. Beyond that the LED runs hotter and less efficient, light output starts to decrease, and the LED die may be damaged. The precise shape of the curve depends on the LED and how well it's heat-sinked.