SoftSpan 16-/14-/12-Bit Current Output DACs Draw Less than 1uA Supply Current
The LTC2751-16 offers accurate DC specifications, including ±1LSB (max) INL and DNL over the -40°C to +85°C industrial temperature range. With its precision linearity and supply current of less than 1uA, the LTC2751-16 is well suited to DC precision positioning systems, high-resolution gain and offset adjustment applications, and portable instrumentation.
The LTC2751-16 also offers excellent AC specifications, including a full-scale settling time of only 2us and a low 2nV-s glitch impulse, which is key for AC applications such as waveform generation. Low glitch impulse reduces the transient voltage spikes that occur during code changes in the DAC. Together, fast settling and low glitch reduce harmonic distortion, making it possible to produce higher frequency, lower noise output waveforms.
The LTC2751 DACs use a bidirectional input/output parallel interface that allows readback of any internal register, as well as the DAC output span setting. A power-on reset circuit returns the DAC output to 0V when power is first applied and a CLR pin asynchronously clears the DAC to 0V in any output range. The LTC2751 DACs are available today in pin-compatible 16-bit, 14-bit, and 12-bit QFN-38 (5mm x 7mm) packages.
Summary of Features: LTC2751-16/LTC2751-14/LTC2751-12
- Six Programmable Output Ranges:
--Unipolar 0V to +5V, 0V to +10V
--Bipolar ±5V, ±10V, ±2.5V, -2.5V to +7.5V
- Low 2uA(max) Supply Current
- ±1LSB INL, ±1LSB DNL Over Temperature
- Low 2nV-s Glitch Impulse
- Fast 2us Settling Time
- 2.7V to 5.5V Single Supply Operation
- Parallel Interface with Readback of All Registers
- Asynchronous CLR Pin Clears DAC Output to 0V in Any Output Range
- Power-On Reset Clears DAC Output to 0V
- 38-Pin 5mm x 7mm QFN Package
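The code-to-voltage mapping for the six SoftSpan ranges listed above is a simple linear scale across the selected span. A minimal sketch of that arithmetic for the 16-bit LTC2751-16 (the range endpoints come from the feature list; the function names and ideal-transfer-function assumption are illustrative, not part of the device's actual register interface, and real outputs also carry the part's INL/DNL and offset errors):

```python
# SoftSpan output ranges (Vmin, Vmax), as listed in the feature summary.
SPANS = {
    "0-5":      (0.0, 5.0),
    "0-10":     (0.0, 10.0),
    "+/-5":     (-5.0, 5.0),
    "+/-10":    (-10.0, 10.0),
    "+/-2.5":   (-2.5, 2.5),
    "-2.5/7.5": (-2.5, 7.5),
}

def volts_to_code(v, span, bits=16):
    """Nearest DAC code for a target output voltage in the given span."""
    vmin, vmax = SPANS[span]
    full_scale = (1 << bits) - 1          # 65535 for the 16-bit part
    code = round((v - vmin) / (vmax - vmin) * full_scale)
    return max(0, min(full_scale, code))  # clamp to valid codes

def code_to_volts(code, span, bits=16):
    """Ideal output voltage for a DAC code (ignores INL/DNL and offset)."""
    vmin, vmax = SPANS[span]
    return vmin + code * (vmax - vmin) / ((1 << bits) - 1)
```

Because both the power-on reset and the CLR pin force the output to 0V in any range, note that 0V corresponds to mid-scale codes in the bipolar spans but to code zero in the unipolar spans.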