Yes, an LED driver would be the best thing to use (especially the PWM type), but a 1s battery voltage is very close to the LED's rating, and I'm not sure any driver or simple resistor would work well there. The voltage drop of LEDs is effectively constant (actually, it's a function of drive current, but we'll assume it's constant for the sake of argument). It's the
current you really want to control here. The best and most reliable thing to do is set up a test rig and measure the voltage drop of the LED at the current you want to drive it at. That gives you the real Vf for this specific LED at that current, and you can design the circuit accordingly.
Using a resistor can be troublesome at these low voltages. Let's say for argument's sake that he wants to use a resistor. The lipo nominal is 3.7v, and the LED's Vf and If are 3.2v @ 2.8A. The resistor needed would be (V_supply - Vf) / If, or about 0.18 ohms. But at the peak battery voltage of 4.2v, the current would instead be about 5.6A (too high), and at the LVC battery voltage of 3.2v, the LED might not light at all, or only very dimly. There just isn't enough voltage "headroom" to use a resistor effectively. And a constant-current transistor setup needs even more voltage headroom.
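To make the resistor math concrete, here's a quick Python sketch of the numbers above (assuming the LED's Vf stays fixed at 3.2v, which is the simplification we agreed to):

```python
# Series-resistor sizing for one LED on a 1s lipo.
# Assumes Vf is constant at 3.2 V @ 2.8 A (a simplification; real Vf
# varies a little with current and temperature).
VF = 3.2   # LED forward voltage, volts
IF = 2.8   # target LED current, amps

# Size the resistor at the nominal cell voltage (3.7 V):
R = (3.7 - VF) / IF
print(f"R = {R:.3f} ohm")   # ~0.179 ohm

# Current that same resistor allows across the battery's real range:
for v_batt in (4.2, 3.7, 3.2):
    i = max(0.0, (v_batt - VF) / R)
    print(f"{v_batt} V battery -> {i:.1f} A through the LED")
```

At 4.2v you get roughly double the rated current, and at 3.2v (LVC) essentially nothing, which is the "no headroom" problem in a nutshell.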
Using multiple LEDs in parallel can be done with a lower supply voltage, but the currents are additive and can get high, especially with the high-power types. 20mA x 5 is only 100mA, but 5 x 2.8A in parallel is 14A.

This is why most reliable circuits use the series arrangement for multiple LEDs. I know you only have one LED here, but just thought I'd throw that in.
The best way to power any LED (especially multiples) is to wire them in series, use a supply voltage around 3-5v higher than the combined voltage drops, and use a constant-current supply so the current stays consistent even as the supply voltage fluctuates (as a battery will). Since the LEDs are in series, the total current draw is still the same 2.8A for all of them; you're just using a higher voltage. Kinda the same idea as in BL where we use higher voltage for more efficiency.
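Here's the series-string idea as a small sketch (assumed numbers: the same 3.2v/2.8A LED, and ~3v of headroom for the constant-current stage):

```python
# Supply needed for a series LED string, using a 3 V headroom rule of thumb.
VF = 3.2        # forward voltage of each LED, volts (assumed)
IF = 2.8        # string current, amps (same for every LED in series)
HEADROOM = 3.0  # extra volts for the constant-current circuit to work with

for n in (1, 3, 5):
    v_supply = n * VF + HEADROOM
    print(f"{n} LED(s): supply >= {v_supply:.1f} V at {IF} A")
```

Note the current never goes up as you add LEDs; only the voltage does, which is why the series arrangement scales so nicely.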
So, what does this all mean? Well, it means that making a proper driver for that single LED will be difficult with a 1s battery pack. It can be done, but the LED brightness will fluctuate as the battery voltage goes from fully charged to LVC. So, I would design your circuit with both max AND min battery voltage in mind, to make sure you aren't exceeding the LED ratings while still providing some output. You're gonna find that a resistor (or driver) set for proper current at max battery voltage (4.2v) will result in a dimmer LED at anything lower.
I created a page at my calc site to help with this stuff:
http://scriptasylum.com/rc_speed/_led.html. It currently won't let you enter 2.8A as an LED current, or a supply voltage this low, for safety reasons (it was designed for normal LEDs), but you can still see what I'm talking about. There are some notes at the bottom which you'll want to read, and don't forget the little help question marks. It pretty much covers all the stuff I said above.
You can use any voltage regulator IC as a constant-current device; it's just a matter of configuration. But ones rated for over 1-1.5A will need to be special-ordered. However, I've used the LM317T (1.5A adjustable voltage reg with a 1.25v reference) along with a pass transistor as a constant-current device for up to 10A before. This regulator works the best as a constant-current device because the minimum voltage headroom needed is only around 2.5v above the total LED Vf.
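For the LM317 specifically, the constant-current configuration is just a resistor between OUT and ADJ: the regulator holds its 1.25v reference across that resistor, so I = 1.25 / R. A quick sketch with the numbers from this thread (at 2.8A you're past the LM317T's own rating, hence the pass transistor):

```python
# LM317 constant-current source: the regulator servos OUT-to-ADJ to its
# 1.25 V reference, so a resistor R there sets I = Vref / R.
VREF = 1.25      # LM317 reference voltage, volts
I_TARGET = 2.8   # desired LED current, amps (needs a pass transistor at this level)

R_SET = VREF / I_TARGET   # ohms
P_SET = VREF * I_TARGET   # watts burned in the set resistor
print(f"R_set = {R_SET:.3f} ohm, dissipating {P_SET:.2f} W")
```

So you'd want something like a 0.45-ohm resistor there, rated 5W or better.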
And as far as heat goes: you WILL need to heatsink the LED. 2.8A @ 3.2v is close to 9W. Failure to do so will result in a very short LED life! And, depending on what driver circuit you use, that may need to be heatsinked as well.
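A rough dissipation budget (same assumed numbers as above) shows why:

```python
# Power budget at full charge: the LED itself plus whatever a linear
# driver (or resistor) has to burn off. Assumed: Vf = 3.2 V, If = 2.8 A.
VF, IF = 3.2, 2.8
V_BATT = 4.2   # fully charged 1s lipo

p_led = VF * IF                 # heat in the LED package
p_driver = (V_BATT - VF) * IF   # heat in a linear driver/resistor
print(f"LED: {p_led:.1f} W, driver: {p_driver:.1f} W")
```

Roughly 9W in the LED and close to another 3W in a linear driver at full charge, all of which has to go somewhere.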