It's safe to say that lithium-ion and lithium-polymer (Li-xx) batteries affect the performance and form factor of portable electronic devices. Without them, it would be hard to imagine notebooks that run as long, cell phones that feel as light, and PDAs that look as small as the ones we see today. The reason is simple. As shown in the table on page 30, Li-xx cells provide the highest energy density, in terms of both weight and volume, of any available rechargeable chemistry. This characteristic makes Li-xx cells particularly attractive for applications where size and weight are critical.
Li-xx batteries also provide other benefits: they have a low self-discharge rate, exhibit no “memory” effect, and offer nearly 100% charge efficiency. But like most things in life, these benefits come with a price tag. Li-xx cells require a special charging method, are highly susceptible to electrical stress, and do not tolerate extreme thermal conditions. However, most OEMs have found ways to balance the advantages and disadvantages of these cells through sound design techniques and available semiconductor devices.
Li-xx cells must be charged according to the manufacturer's recommended method to maximize the battery's safety, cycle life, and capacity. The recommended procedure involves several stages. During the initial stage of charge, cells must be qualified and possibly conditioned. Manufacturers do not recommend charging if the battery voltage or temperature is outside its specified limits. For safety, manufacturers suggest suspending any charging above 45°C or below 10°C until the battery returns to its normal operating temperature range. For a deeply discharged battery (typically 3V per cell or less), apply a low-level conditioning current prior to full charge.
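The qualification rules above can be sketched as a simple decision function. This is an illustrative sketch, not vendor code: the 10°C/45°C window and the 3V deep-discharge threshold are the example values from the text, and real limits always come from the cell manufacturer's data sheet.

```python
def select_charge_stage(cell_voltage_v, temp_c,
                        t_min_c=10.0, t_max_c=45.0,
                        deep_discharge_v=3.0):
    """Decide the initial charge stage for one Li-xx cell.

    Thresholds (10 C to 45 C, 3 V/cell) mirror the article's examples;
    substitute the values from your cell's data sheet.
    """
    if temp_c < t_min_c or temp_c > t_max_c:
        return "suspend"        # wait for normal operating temperature
    if cell_voltage_v <= deep_discharge_v:
        return "precondition"   # low-level conditioning current first
    return "fast_charge"
```

A charger controller would re-evaluate this decision continuously, since temperature and cell voltage both move during charge.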
After the initial qualification and preconditioning, Li-xx batteries are charged with a current of 1C or less until the battery reaches its charge voltage limit. The charger then holds a constant voltage of 4.1V or 4.2V per cell; the exact limit varies with the cell manufacturer and the anode material (coke or graphite). To maximize safety and the available capacity, you must regulate the charge voltage to ±1% accuracy. During the constant-voltage phase, the current drawn by the battery tapers off, and the charger terminates the charge once the current falls below 10% to 15% of the initial charging current.
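The constant-current/constant-voltage (CC/CV) sequence with taper termination can be expressed as one control decision per sampling interval. This is a hedged sketch: the 1C fast-charge rate, 4.2V limit, and 10% taper fraction are the article's example numbers, not universal constants.

```python
def charger_step(cell_voltage_v, charge_current_a,
                 fast_charge_a=1.0, cv_limit_v=4.2,
                 taper_fraction=0.10):
    """Return (mode, setpoint) for one CC/CV control decision.

    Below the voltage limit: regulate current (CC phase).
    At the limit: regulate voltage (CV phase) until the current
    tapers below taper_fraction of the fast-charge current.
    """
    if cell_voltage_v < cv_limit_v:
        return ("constant_current", fast_charge_a)   # CC phase
    if charge_current_a > taper_fraction * fast_charge_a:
        return ("constant_voltage", cv_limit_v)      # CV phase, tapering
    return ("done", 0.0)                             # terminate charge
```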
The choice of power conversion method to regulate charge current or voltage depends on factors such as the number of cells, cell capacity, and the application. Most single-cell applications use linear regulation or a current-limited power supply. Linear regulation offers several advantages: low component cost, design simplicity, and “quiet” operation due to the absence of high-frequency switching. The linear topology does, however, dissipate power in the system, mostly during the current-regulation phase of the charge cycle. This is a drawback if the designer has no means of managing thermal issues in the design. Fig. 1 shows a linear charger.
Single-cell portable devices also use the current-limited power supply method shown in Fig. 2. In most designs, an inexpensive external wall transformer powers the charger, and regulation of the transformer's unregulated dc output provides a constant charge current or voltage for the battery. Where the designer needs to keep system power dissipation, size, and cost to a minimum, an alternative is to take advantage of the relatively high output resistance of these wall transformers. In this topology, the transformer itself acts as a current-limited source, eliminating the need to regulate the charge current.
Applications that need higher charge current or involve cells in series or parallel typically use a switchmode topology for charge regulation. The advantages of this topology are the high-efficiency operation and lower cost at higher charging currents.
Safety Is the Key
Due to the volatile nature of lithium, it's important to protect the cells against thermal and electrical stress during their operation. Nearly all Li-xx packs contain embedded electronics to protect the cells and the end-user against abnormal conditions. The following are the four key protection requirements:
- Overvoltage — This is the most critical threshold for safety, because an overvoltage condition may lead to catastrophic failure. A major design concern is a charger that fails to regulate correctly, allowing the cell voltage to rise well above its recommended maximum. This applies to both OEM and aftermarket chargers.
- Undervoltage — Undervoltage, or overdischarge, is a concern mainly for longevity of the battery pack capacity rather than catastrophic failure. A deeply discharged Li-xx battery adversely affects the cell's ability to regain that energy through charging.
- Overcurrent — Overcurrent protection matters most while the pack is in a user's hands or pocket, where an accidental short circuit may lead to permanent capacity loss or even catastrophic failure. Set this threshold high enough to let system input capacitors charge and to tolerate connector bounce and other parasitics, but low enough that a sustained short circuit cannot rapidly discharge the battery.
- Overtemperature — Sense temperature to protect from repeated short-circuit failures and possible catastrophic failure.
A typical battery protection solution consists of a controller IC that detects and acts upon the different battery conditions, two pass MOSFETs, and a thermal fuse. The controller IC monitors the cell voltage for over- and undervoltage conditions, and the MOSFETs switch the charge and discharge paths. As the battery state changes, the controller switches off the MOSFETs to avoid any unsafe condition. A sense resistor, or the voltage drop across the MOSFETs, senses an overcurrent condition. To avoid false triggers, the controller provides a delayed response time or uses a multiple-sampling technique. Finally, a positive-temperature-coefficient (PTC) thermistor provides thermal protection (Fig. 3, on page 27).
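The controller's fault logic, including the multiple-sampling debounce that prevents false triggers, can be sketched as follows. The threshold values are hypothetical placeholders for a single cell, not figures from any specific protection IC.

```python
class ProtectionMonitor:
    """Sketch of a pack-protection controller's fault logic.

    Trips (opens the pass MOSFETs) only after `samples` consecutive
    fault readings, mimicking the multiple-sampling technique the
    article describes for rejecting transients.
    """

    def __init__(self, ov_v=4.30, uv_v=2.30, oc_a=3.0, samples=3):
        # Hypothetical example thresholds; real ICs fix these per cell spec.
        self.ov_v, self.uv_v, self.oc_a = ov_v, uv_v, oc_a
        self.samples = samples
        self._fault_count = 0
        self.fets_on = True

    def update(self, cell_v, current_a):
        """Feed one sample; returns the (possibly new) MOSFET state."""
        fault = (cell_v > self.ov_v or cell_v < self.uv_v
                 or abs(current_a) > self.oc_a)
        self._fault_count = self._fault_count + 1 if fault else 0
        if self._fault_count >= self.samples:
            self.fets_on = False    # open pass MOSFETs
        return self.fets_on
```

A one- or two-sample glitch (e.g., an inrush spike into input capacitors) resets the counter and never trips the FETs; only a sustained fault does.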
Imagine owning and driving a car that has no fuel gauge. No matter what make or model you drive, not knowing how much gas remains in the tank could have troublesome consequences, and the situation is even worse if the fuel gauge is inaccurate. The same holds true for the new generation of handheld electronics. The relatively simple cellular phones of yesteryear are now audio players, PDAs, pagers, Internet browsers, and more. This growth in feature content and complexity has had a dramatic impact on the battery and on battery management. The simple, inaccurate voltage-based capacity-monitoring schemes of the past are no longer acceptable. Users demand to know the “real” available capacity: how many more audio files they can download, how many more stock trades they can place, and how many more calls they can make before the battery finally runs out. An inaccurate capacity gauge is both annoying and frustrating to the end user.
A voltage-based capacity-monitoring scheme lacks the required accuracy for today's handheld applications because a lithium battery's voltage profile is relatively flat during both charge and discharge. In fact, during a typical charge cycle, more than 70% of the charge time is spent in constant-voltage mode, which replenishes more than 40% of the charge capacity. Similarly, on discharge, most of the capacity is delivered over a relatively narrow voltage range, from 3.7V down to 3V. Fig. 4 shows example discharge curves for a common 18650-type Li-ion cell.
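To see why a flat curve defeats voltage-based gauging, consider a linear voltage-to-capacity lookup built from two points on the flat region. The voltages and capacities below are illustrative assumptions, not measured data from any cell:

```python
def soc_from_voltage(v,
                     v_hi=3.75, soc_hi=0.60,
                     v_lo=3.70, soc_lo=0.40):
    """Naive linear voltage-to-state-of-charge lookup.

    The two anchor points are invented for illustration: on a flat
    Li-ion curve, a 50 mV span can cover ~20% of capacity.
    """
    return soc_lo + (v - v_lo) * (soc_hi - soc_lo) / (v_hi - v_lo)

# With these assumed points, a mere 25 mV measurement error shifts
# the capacity estimate by 10 percentage points.
```

The steeper the curve, the smaller this error; on the flat midsection of a Li-ion discharge, ordinary measurement noise translates into unacceptably large capacity errors.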
To make matters worse, the available capacity also changes as a function of self-discharge, cell aging, temperature, and the discharge rate or profile. The latter is particularly important in wireless applications. For instance, the discharge profile of a GSM handset can involve current pulses of more than 1A at about a 25% duty cycle, while the standby drain is less than 10mA.
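The average drain of such a pulsed profile is easy to work out, and it shows how far the battery's operating point sits from either extreme. Using the article's rough GSM figures (1A pulses at 25% duty cycle, under 10mA in standby):

```python
def average_current_ma(pulse_ma, duty_cycle, standby_ma):
    """Time-averaged current of a two-level pulsed load."""
    return pulse_ma * duty_cycle + standby_ma * (1.0 - duty_cycle)

# Article's rough GSM figures: 1 A pulses, 25% duty, ~10 mA standby.
avg = average_current_ma(1000.0, 0.25, 10.0)   # 257.5 mA average
```

A voltage-based gauge sampling during a 1A pulse would see a heavily sagged terminal voltage; sampling in standby, an almost unloaded one. A coulomb counter integrates through both.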
You can accomplish accurate capacity monitoring by coulomb counting, or measuring the charge input to and subsequently removed from the battery. Fig. 5, on page 28, provides an example of such a solution. The device in this example measures the voltage drop across a low-value series sense resistor between the negative terminal of the battery and the battery pack ground contact. A low-offset internal voltage-to-frequency converter converts this voltage into charge and discharge counts. By using the accumulated counts in the charge, discharge and self-discharge registers, an intelligent host controller can determine battery state-of-charge information. Other key features of this solution include single-wire digital interface to the system, accurate cell voltage and temperature monitoring, and on-chip nonvolatile flash memory for retaining critical information when the cell is deeply discharged or removed from the system.
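The core of coulomb counting is just current integrated over time. The class below is a minimal software sketch of that bookkeeping, not the behavior of any particular gas-gauge IC: a real device converts the sense-resistor voltage to counts with a voltage-to-frequency converter in hardware, whereas here we accumulate milliamp-hours directly.

```python
class CoulombCounter:
    """Minimal coulomb-counting sketch: integrate current over time."""

    def __init__(self, capacity_mah, initial_mah=None):
        self.capacity_mah = capacity_mah
        self.remaining_mah = (capacity_mah if initial_mah is None
                              else initial_mah)

    def accumulate(self, current_ma, hours):
        """Add one measurement interval.

        Positive current = charge into the battery,
        negative current = discharge (including self-discharge).
        """
        self.remaining_mah += current_ma * hours
        # Clamp to the physical range of the pack.
        self.remaining_mah = min(max(self.remaining_mah, 0.0),
                                 self.capacity_mah)

    @property
    def state_of_charge(self):
        """Fraction of full capacity remaining, 0.0 to 1.0."""
        return self.remaining_mah / self.capacity_mah
```

In practice the host also corrects the count for temperature, aging, and discharge-rate effects, which is why gauge ICs pair the counter with voltage and temperature measurement and nonvolatile storage.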
For more information on this article, CIRCLE 332 on Reader Service Card