Electronics companies at the leading edge of performance are being forced to address board-level thermal requirements at the earliest stages of design. Printed circuit boards (PCBs) constitute the primary area where mechanical engineers can influence the thermal design at the conceptual design phase. So, the ability to accurately predict the thermal performance of the PCB early in the development process is becoming more critical than ever before.
A key limitation of tools designed to simulate the performance of PCBs early in the design process has been their inability to take into account the effects on thermal conductivity of localized concentrations of copper. But a recent improvement in these design tools gives designers the ability to model the effects of copper concentration on thermal conductivity in board-level thermal simulations.
The latest generation of PCB design tools makes it possible to model copper concentration at whatever level of detail is desired, even to the point of modeling each individual trace. This approach has been demonstrated to substantially improve the accuracy of upfront PCB thermal simulation. In turn, this greater accuracy can help improve time to market, and reduce engineering and manufacturing costs.
Increasing Importance of Board Design
In the fairly recent past, board-level thermal simulation was not considered to be a critical part of the mechanical design flow. Power-dissipation levels were low, and as long as temperature and airflow guidelines were met, thermal issues were usually relatively easy to resolve.
In this scenario, the task of the mechanical engineer was simple: ensure that the chassis housed each PCB within sufficient airflow at the right temperature. Thermal management was usually addressed at the time the chassis was designed, normally by adding fans and cooling vents.
The thermal design was typically based on a system-level simulation. The chassis designed by this method normally had a relatively long shelf life, typically three to five years. Once the thermal solution was fixed, it was expected to last through several generations of boards. The mechanical and PCB design processes were largely decoupled, enabling mechanical and PCB designers to work independently.
As Moore's Law marched on, mechanical and PCB designers found themselves having to interact more frequently. This was driven by the fact that PCB power dissipation for many designs was crossing an important threshold beyond which thermal compliance came into question. This threshold is linked to the ability of the PCB itself to act as a heatsink.
As a large flat surface, the PCB is very effective at transferring heat from components to the air. However, like any heatsink, it has a limit. A common rule of thumb for this limit comes from a simple heat-transfer calculation that expresses the heatsinking ability of a PCB.
Given that 100°C is a common maximum temperature of components, the power that a PCB can dissipate is estimated to be:
Q = h × SA × (T1 − T2),
where Q is the power dissipation in watts (W), h is the heat-transfer coefficient in W/(°C·in²), SA is the surface area in in², T1 is the maximum component temperature in °C and T2 is the air temperature in °C.
The heat-transfer coefficient depends largely on the airflow speed, although no simple equation describes the relationship between the two variables. The relationship can vary depending on the flow regime (laminar, transitional or turbulent) and the geometry of the heat-dissipating object (flat plate, cylinder, etc.). For a PCB sitting in 20°C airflow at 200 linear ft/min, the maximum power dissipation is 1.8 W/in².
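The rule of thumb above can be checked with a few lines of arithmetic. This is a minimal sketch: the heat-transfer coefficient value used here is an assumed typical figure for roughly 200 linear ft/min of airflow (chosen so the numbers match the stated result), not a measured property.

```python
# Rule-of-thumb PCB power budget: Q = h * SA * (T1 - T2).
# h below is an ASSUMED typical value for ~200 linear ft/min airflow;
# real values depend on flow regime and geometry.

def max_board_power(h, surface_area, t_component_max, t_air):
    """Estimate the power (W) a PCB can dissipate as a heatsink."""
    return h * surface_area * (t_component_max - t_air)

h = 0.0225            # W/(degC * in^2), assumed for 200 lfm
t1, t2 = 100.0, 20.0  # degC: max component temp, air temp

q_per_in2 = max_board_power(h, 1.0, t1, t2)  # per square inch of board
print(f"{q_per_in2:.1f} W/in^2")  # -> 1.8 W/in^2, matching the rule of thumb
```

With an 80°C temperature budget, each square inch of board surface can reject about 1.8 W; boards dissipating more than this per unit area need a supplemental thermal path.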
Simulation Process Flow
Electronics manufacturers are beginning to address these problems by paying more attention to thermal design at the board level. Often, when designing a new board for an existing enclosure, electrical engineers simulate the board alone to identify hot spots. Problems identified at this point can often be addressed by layout changes that cost almost nothing this early in the process. Board-level simulation tools are usually much easier for electrical engineers to use because they are designed around tools they already use, such as functional block diagrams and physical layouts.
In a typical board-level thermal simulation process flow, the systems architect will develop the initial concept design by creating a functional block diagram. The hardware design engineer then drives the first physical layout directly from the block diagram.
At an early stage in the design process, long before the mechanical engineer gets involved, the electrical engineer can use board-level simulation to evaluate the new board design in an existing system. A 3-D computational fluid dynamics (CFD) solver predicts airflow and temperature for both sides of the board.
Often the designer will identify hot spots, and thus, cooling management can be considered from the earliest stages of the design process. Changes made to the functional block diagram are reflected instantly in the physical layout and thermal representations.
Local Copper Concentration
As the importance of board-level design rises, the accuracy of the thermal simulation results becomes more critical. The thermal conductivity of the board itself has an important impact on simulation results in many designs. PCB thermal conductivity is particularly important in applications where conduction is the primary mechanism of thermal management. Local thermal conductivity can become critical in many applications because of the large difference between the thermal conductivity of copper and the dielectric material.
The traditional smeared approach determines how much copper is on each layer of the board and then calculates the average thermal conductivity for that layer by averaging the thermal properties of the copper and dielectric material. The problem with this approach is that it does not take into account the local thermal conductivities, which can have a major impact on the thermal performance of the board.
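The smeared calculation amounts to an area-weighted average of the two materials' conductivities. The sketch below shows the in-plane (parallel) form of that average; the property values are assumed typical figures for copper and FR-4-class dielectric, not values taken from the article's board.

```python
# Smeared-copper conductivity for one PCB layer: an area-weighted
# (parallel) average of copper and dielectric. Property values are
# ASSUMED typical figures, not data from the article.

K_COPPER = 385.0      # W/(m*K), typical for copper
K_DIELECTRIC = 0.3    # W/(m*K), typical for FR-4-class dielectric

def smeared_k(copper_fraction):
    """In-plane conductivity of a layer with the given copper area fraction."""
    return copper_fraction * K_COPPER + (1.0 - copper_fraction) * K_DIELECTRIC

# A layer that is 20% copper by area:
print(round(smeared_k(0.20), 1))  # -> 77.2 W/(m*K)

# The board-wide average hides local variation:
print(round(smeared_k(0.60), 1))  # -> 231.1 near a trace-dense, high-power part
print(round(smeared_k(0.05), 1))  # -> 19.5 in a sparsely routed region
```

The last two lines illustrate the core problem: a single averaged value can be an order of magnitude off in exactly the regions that matter most thermally.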
For example, consider a board with a regulator that dissipates the vast majority of the power. What matters most from a thermal-performance standpoint is the thermal resistance of the primary conduction path between the component and the chassis. The highest-power components usually have a large number of traces running to them, so the copper concentration and thermal conductivity in their vicinity are considerably higher than the board average.
The thermal resistance of the primary conduction path seen by this high-power-dissipation component is usually much lower than the average for the board. The result is that the traditional smeared approach typically shows junction temperatures higher than they actually are. Time and money may be wasted solving problems that either are less serious than they appear to be or do not exist at all.
Developers of PCB thermal simulation software are addressing this challenge by providing the ability to divide each layer of copper and dielectric material into arrays of patches of variable sizes with the thermal conductivity being separately defined for each patch.
The ability to determine the size of patches used to subdivide the layer is important because increasing the number of patches can substantially increase the amount of time required to simulate the thermal performance of the design. Users need the ability to trade off simulation accuracy against solution time.
Typically, in the early stages of the design process, designers will use larger patches to quickly evaluate a large number of design alternatives. Once they have identified a few promising designs, the patch size will typically be decreased to determine the thermal performance with a higher level of certainty. Users also have the ability to reduce the patch size in areas that are more critical to thermal management, such as the area surrounding high-power-dissipation components.
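The patch-array idea described above can be sketched in a few lines: subdivide the black-and-white layer image into an n × n grid and compute a smeared conductivity per patch rather than per layer. The 8 × 8 "bitmap" and the conductivity values here are made-up illustrations, not the article's exported data.

```python
# Per-patch smeared conductivity from a layer image (1 = copper pixel,
# 0 = dielectric pixel). The bitmap and property values are ASSUMED
# stand-ins for the image a PCB tool would export.

K_CU, K_DIEL = 385.0, 0.3  # W/(m*K), assumed typical values

def patch_conductivities(bitmap, n):
    """Split a square bitmap into an n x n array of patches and return
    the area-averaged (parallel) conductivity of each patch."""
    step = len(bitmap) // n
    grid = []
    for i in range(n):
        row = []
        for j in range(n):
            pixels = [bitmap[r][c]
                      for r in range(i * step, (i + 1) * step)
                      for c in range(j * step, (j + 1) * step)]
            frac = sum(pixels) / len(pixels)  # copper area fraction
            row.append(frac * K_CU + (1.0 - frac) * K_DIEL)
        grid.append(row)
    return grid

# Toy layer: solid copper in the top-left quadrant, bare elsewhere.
bitmap = [[1 if r < 4 and c < 4 else 0 for c in range(8)] for r in range(8)]
ks = patch_conductivities(bitmap, 2)
print(ks[0][0], ks[1][1])  # -> 385.0 in the copper patch, 0.3 in the bare one
```

Raising n refines the conductivity map at the cost of solver time, which is exactly the accuracy-versus-runtime trade-off the text describes.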
The following example shows how the ability to model local copper concentration can improve the accuracy of PCB design simulation. The example is based on a fairly simple board, with 30 components, that operates in a conduction-cooled environment. The heat is conducted through wedge-locks that provide a thermal and mechanical connection from the PCB to the chassis.
This system is designed for most of the heat to be conducted through the board into the chassis, although a fraction of the heat will naturally be conducted into the air. In this example, the majority of the power in the board is dissipated by a microprocessor with 80 leads and a land. The primary heat conduction path runs between the land and the leads. Although this example uses a microprocessor, the techniques employed to simulate the processor's thermal performance can be applied to any power transistor, IC, module or other power device.
In this example, we begin by exporting the board layout from the Allegro PCB design system into Flomerics' FLO/PCB software. A special menu called Flow EDA is installed in Allegro; we use it to gather information from the board, including the location, size and orientation of components, as well as layer information. Next, we export the bitmap images that Allegro generates for each layer (with the copper appearing as black and the dielectric material as white) into the simulation software. In this simulation, we assume a 45°C ambient temperature.
In the first simulation, we employ the smeared copper approach, which averages out the thermal conductivity over a broad area. Fig. 1 shows the local area of interest around the microprocessor, while Fig. 2 shows the results of this simulation.
With this approach, we divide the board into two local areas, one incorporating the processor and a band surrounding the processor (the highlighted area in Fig. 1), and another comprising the rest of the board. The PCB thermal simulation software then calculates the amount of copper and dielectric in each local area and an average value for thermal conductivity in each area.
The smeared approach assumes that the thermal conductivity of the entire area around the component is uniform and consists of this average value. This approach yields a junction temperature of 55.4°C above ambient.
Next, we specify a 12 × 12 array in the area around the microprocessor. A dialog box appears showing the captured image, and slider bars control the number of elements on each side. The simulation software then calculates the average thermal conductivity in each of the 144 patches defined by the array, based on the images of the board traces provided by the PCB design software.
The array more realistically depicts the location of the copper traces and so more accurately models the flow of heat away from the processor as shown in Fig. 3 and Fig. 4. The array approach shows a reduction in the junction temperature of the processor to 50.2°C above ambient.
Finally, we model the area around the processor in full detail, assigning the thermal conductivity of copper to the surface area covered by each trace and the thermal conductivity of the dielectric material to the rest of the area (Fig. 5). This detailed approach provides the highest level of accuracy that can be achieved in modeling heat conduction around the processor.
In this case, the junction temperature falls to 45°C above ambient (Fig. 6). Assuming that this final iteration is perfectly accurate, the copper-patches approach produced an error of 6.76%, while the traditional smeared-copper approach produced an error of 17.82%.
The junction temperature fell steadily as model resolution increased because the finer models resolve a more concentrated and efficient heat-conduction path. Accuracy naturally increases as the patch size is reduced.
In this application, the primary heat-conduction path runs between the leads and the land, which are separated by a 0.6-mm gap. As the patch size approaches 0.6 mm, the accuracy of the results can be expected to improve substantially.
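A one-dimensional resistance sketch shows why resolving a small dielectric gap matters so much. The geometry and material values below are illustrative assumptions (not taken from the article's board): when a coarse patch averages copper and gap together, the series resistance of the path can be misrepresented by orders of magnitude, and the direction and size of the error depend on the local geometry.

```python
# Why patch size matters near a 0.6-mm dielectric gap: a 1-D
# conduction-resistance sketch. All dimensions and properties are
# ASSUMED illustrative values, not data from the article's board.

K_CU, K_DIEL = 385.0, 0.3    # W/(m*K), assumed
AREA = 1e-6                  # m^2, assumed path cross-section
L_CU, L_GAP = 5e-3, 0.6e-3   # 5 mm copper run, 0.6 mm dielectric gap

def resistance(length, k):
    """1-D conduction resistance R = L / (k * A), in K/W."""
    return length / (k * AREA)

# Resolved model: copper and gap treated as resistances in series.
r_resolved = resistance(L_CU, K_CU) + resistance(L_GAP, K_DIEL)

# Coarse patch: the whole 5.6-mm path smeared to one average conductivity.
frac_cu = L_CU / (L_CU + L_GAP)
k_smeared = frac_cu * K_CU + (1 - frac_cu) * K_DIEL
r_smeared = resistance(L_CU + L_GAP, k_smeared)

print(f"resolved: {r_resolved:.0f} K/W, smeared: {r_smeared:.0f} K/W")
```

In this assumed geometry the unresolved patch hides the gap's dominant resistance almost entirely, which is why patches comparable in size to the finest feature on the conduction path are needed for trustworthy results.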