BOOK DETAILS

CompTIA Strata Study Guide Authorized Courseware: Exams FC0-U41, FC0-U11, and FC0-U21

by Andrew Smith

ISBN: 9780470977422

Publisher: Sybex

Category: Computers & Internet / Computer Science

Sample Chapter


Chapter One

Processing and Memory

COMPTIA FUNDAMENTALS OF TECHNOLOGY (UK) / IT FUNDAMENTALS (USA)

OBJECTIVES COVERED IN THIS CHAPTER:

  •   1.1 Identify basic IT vocabulary

    * Processor speed/cores

    * RAM

    Every computer, whether it's a high-speed supercomputer at a research facility or a hand-held game for children, has a processor—the component that performs math calculations. Every computer also has some type of random access memory (RAM), which is used for temporary data storage as data moves into and out of the processor. In this chapter, you'll review the basics of processors and RAM. Having this information will help you evaluate these components in the computers you buy, use, and maintain.

    Processors

    Every computing device has a central processing unit (CPU), more commonly known as a processor (or microprocessor). Processors are a part of mobile phones, gaming consoles, digital music players, and everything from automobiles to washing machines. Some computers even have multiple processors that share the computing load for faster performance.

    Processors are integrated circuits containing millions of electronic components called transistors. Transistors are an essential component of any electronic device, including a computer. A transistor is an electrical gate that either lets current through or blocks it, depending on its current state. Transistors are the basis of binary processing—that is, processing based on things being in one of two states: on or off, 1 or 0.

    At its most basic level, a processor's job is to do math. It accepts numbers as input, performs calculations on them, and delivers other numbers as output. It's mostly oblivious to the significance of those numbers; it just runs the instructions it has been given. It's a common misconception that the processor is the brain of the computer—it's not nearly as sophisticated as a human brain. It just does what it's told, like a hand-held calculator, but at incredibly high speed. The brains of the operation would more accurately be the operating system (OS), which feeds the numbers to the processor and uses the results.

    Different processors have different instruction sets—that is, different math calculations they can perform. The advanced processors in personal computers (PCs) have very large and complex instruction sets; the simple processors in items like appliances have much smaller ones. The OS must be written with the processor's instruction set in mind so it can send the right codes to activate the desired instructions.

    Processors also vary in the number of bits they can accept and process as input. A bit is a binary digit, either 0 or 1. The more bits a processor can accept simultaneously, the faster it can work through the backlog of data to be processed. The earliest PCs had 8-bit or 16-bit processors; today, PCs have 32-bit or 64-bit processors. The higher the number of bits, the larger the word size the processor can accept. Word size refers to the amount of data that can simultaneously enter the processor in one operation.

    Processors work only with binary digits, so the OS translates all numbers to the binary numbering system in order to send them for processing. Humans most commonly use the decimal numbering system (digits 0 through 9). Computers typically use hexadecimal numbering (16 digits, 0 through 9 plus A through F) or binary numbering (2 digits, 0 and 1). Binary is used when interacting with the processor, and hexadecimal is used when the OS refers to memory addresses. Exercise 1.1 demonstrates how to convert between binary and other numbering systems.
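
    Exercise 1.1 walks through converting numbers by hand. Purely as an illustrative sketch (not part of the book's exercise), a few lines of Python show the same decimal, binary, and hexadecimal relationships; the value 201 is an arbitrary example:

        # Illustrative sketch: one value expressed in decimal, binary, and hexadecimal.
        value = 201                 # decimal (base 10)

        print(bin(value))           # 0b11001001  -> binary (base 2)
        print(hex(value))           # 0xc9        -> hexadecimal (base 16)

        # Converting back to decimal from the other bases:
        print(int("11001001", 2))   # 201
        print(int("C9", 16))        # 201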

    Processor Brands

    A variety of manufacturers produce processors that serve inside appliances and other noncomputer devices. However, for personal computer processors, market competition has virtually eliminated all but two manufacturers, Intel and AMD.

    NOTE

    Apple Macintosh computers used Motorola processors for many years but switched to Intel processors in mid-2006.

    Intel Processors

    Intel is the most popular brand of processors for personal computers. Since the company's founding in 1968, it has manufactured more than 50 different processors, plus graphics cards, motherboards, and various computer peripheral devices.

    The original IBM PC and PC-XT computers both ran on Intel's 8088 processor. Later IBM models also used Intel processors, such as the PS/2 that used the 16-bit 80286 processor. In 1993, Intel released the original Pentium processor, a 32-bit CPU. It was followed by the Pentium II in 1997, the Pentium III in 1999, and the Pentium 4 in 2000.

    Intel's current offerings include a 64-bit line called Intel Core, which includes the Core i3, Core i5, and Core i7. The Core i3 is an entry-level CPU, the Core i5 is midrange, and the Core i7 is a high-end product. As you might expect from the word core in the name, each of these is a multicore processor (containing from two to six cores, depending on the model, each of which functions as a processor in its own right).

    Within Intel's Core lineup are many different versions of the i3, i5, and i7, for both desktop and laptop use, each with a different codename and combination of number of cores, L3 cache size, and socket type. For example, the brand name Core i3-5xx (where each x represents a digit), codenamed Clarkdale, has two cores and a 4 MB L3 cache and fits in an LGA 1156 socket. Some of the other codenames include Arrandale, Lynnfield, Gulftown and, the most recent of these, Sandy Bridge.

    AMD Processors

    Advanced Micro Devices (AMD) has been the strongest competitor to Intel in the PC processor market since the 1990s. AMD processors have historically been less expensive than Intel's, while offering roughly equivalent performance. Because AMD processors use different sockets and supporting chipsets than Intel's, they require motherboards specifically designed for them. The motherboard is the large circuit board inside a PC to which everything else connects.

    AMD's early CPU offerings included the K5, K6, K7, K8, and K9 processors, each of which competed directly with an Intel processor. AMD's current lineup includes the Phenom II, Athlon II, and Turion II lines.

    Processor Sockets

    Processors require the correct type of socket on the motherboard into which you're installing them. When buying a motherboard, make sure the processor socket is appropriate for the CPU you plan to use.

    Sockets use various code names and numbering. The numbering is often based on the number of pins or contacts. For example, an LGA 1356 socket has 1,356 contacts. In earlier times, a pin grid array (PGA) was the preferred style of processor socket. It consisted of a grid of tiny holes into which the pins on the back of a CPU chip were secured. Figure 1.1 shows a PGA socket in a motherboard circa 2005.

    However, in recent years, this design has been replaced by land grid array (LGA), a style that has no pins on the chip. Figure 1.2 shows an example of an LGA socket. In place of the pins are tiny pads of bare gold-plated copper that touch corresponding pins on the socket. The main advantage of LGA is size; the pads can be much smaller than a socketed pin, so the CPU can contain more connection lines to the motherboard without the size of the CPU socket becoming very large.

    For mobile computers, where the processors and motherboards are very small, a different type of socket is used. It's either a very small PGA socket (such as the PGA-989) or a ball grid array (BGA) socket, which is an updated type of PGA in which pins are replaced by balls of solder.

    Processor Speed and Performance

    Just as an automobile has unique performance features that single it out from the crowd, computer processors have specialized performance features that distinguish each model. In this section, you'll learn about some of the ways that processor performance is measured and enhanced.

    We measure the speed of a processor in hertz (Hz), or cycles per second. Each time the processor's internal clock completes a full cycle, one cycle has elapsed; a processor running at 1 Hz would complete one cycle per second. Modern processors operate at millions (megahertz, MHz) or billions (gigahertz, GHz) of cycles per second. The original IBM PC, released in 1981, operated at 4.77 MHz. Processor speeds have increased exponentially since then because of new technology advances, with some of the fastest processors today running at more than 3 GHz.
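
    A quick calculation (the arithmetic is an illustration, not from the book) using the figures just quoted shows how large that jump is:

        # Illustrative arithmetic based on the clock speeds quoted above.
        original_pc_hz = 4.77e6     # 4.77 MHz (original IBM PC, 1981)
        modern_cpu_hz = 3.0e9       # 3 GHz (a fast modern processor)

        print(modern_cpu_hz / original_pc_hz)   # ~629 times more cycles per second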

    In Table 1.1, you can see a relative comparison of some single-core clock speeds. Notice that the dates end at 2008. That's not a misprint; it's just that dual-core and quad-core processors have replaced single-core models in recent years. You'll learn about those processors later in this chapter.

    Don't confuse the number of cycles per second with instructions per second (IPS), the number of instructions that a processor can complete per second. Many instructions carried out by processors take multiple internal clock cycles to execute. Many current systems compute in millions of instructions per second (MIPS).

    A processor's speed is related to its number of instructions per second, but other factors besides speed affect how many instructions it can process per second. For example, a multicore CPU can process more instructions than a single-core one, and other technology enhancements such as hyperthreading and efficient cache usage further increase a processor's capability. As a point of reference, the Intel Core i7 Extreme Edition (quad core) runs at 3.3 GHz and processes 147,600 MIPS.
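
    A rough back-of-the-envelope calculation using the figures just quoted (the calculation itself is an illustration, not from the book) shows why clock speed alone doesn't determine instruction throughput: the quoted MIPS figure works out to many instructions completed per clock cycle on each core.

        # Back-of-the-envelope sketch using the example figures quoted above.
        clock_hz = 3.3e9        # 3.3 GHz clock
        cores = 4               # quad-core
        mips = 147_600          # quoted throughput, millions of instructions per second

        instructions_per_sec = mips * 1e6
        total_cycles_per_sec = clock_hz * cores

        # Roughly 11 instructions per cycle per core -- far more than one,
        # thanks to multiple cores and the enhancements described above.
        print(instructions_per_sec / total_cycles_per_sec)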

    Another major contributor to the performance of any processor is the quality and speed of the motherboard that supports it. The motherboard's system bus (also called the front-side bus) is the pathway that delivers data to the processor. The speed at which this pathway operates, known as the front-side bus speed, is set by the system timer on the motherboard. Most motherboards can automatically adjust the timer speed based on the installed processor; on older models, it was necessary to set jumpers on the motherboard manually to indicate what processor type and speed was installed.

    NOTE

    The word bus as it's used in computing comes from Latin: it's a shortened form of omnibus, meaning "for all." Here a bus is a pathway that can carry all the data.

    Processor Cache

    Sometimes the processor sits idle through one or more cycles because the motherboard hasn't delivered any data for it to process. That happens for a variety of reasons. One of them is that some data has to travel all the way from memory to the processor and, even though the motherboard's front-side bus is fast, it's not fast enough to keep up with the processor in delivering that data. Although system memory is probably less than 50 millimeters from the processor, in terms of processor performance it might as well be 1,000 miles away.

    Therefore, to increase the instructions processed per second, computer manufacturers have looked for ways to keep frequently used data closer to the processor, so it has a shorter distance to travel when it's needed. That's the purpose of a cache. A cache is a temporary storage area located very near the processor and connected to it by an extremely high-speed pathway. It holds recently used data so the data doesn't have to be re-retrieved from RAM every time.

    Modern processors include multiple cache levels. The Level 1 (L1) cache is the smallest, and it's on the processor die itself. In other words, it's an integrated part of the manufacturing pattern that's used to stamp the processor pathways into the silicon chip. You can't get any closer to the processor than that. The Level 2 (L2) cache is larger and almost as close—it's in the processor chip, but it's not usually part of the die (see Figure 1.3). Some processors also use a Level 3 (L3) cache. When the processor needs some data, it first checks the L1 cache and retrieves the data from there if possible. If not, the processor checks the L2 cache and then the L3 cache. If the data is in none of those places, the processor retrieves it from RAM.
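
    As a minimal sketch of the lookup order just described (the data and addresses here are invented for illustration, and real caches work on blocks of memory rather than single values), the search proceeds from L1 outward and falls back to RAM on a miss:

        # Minimal sketch of the L1 -> L2 -> L3 -> RAM lookup order described above.
        # The dictionaries and addresses are invented for illustration.
        l1_cache = {0x10: "a"}
        l2_cache = {0x10: "a", 0x20: "b"}
        l3_cache = {0x10: "a", 0x20: "b", 0x30: "c"}
        ram = {0x10: "a", 0x20: "b", 0x30: "c", 0x40: "d"}

        def read(address):
            # Check the caches from smallest/closest (L1) outward.
            for name, level in (("L1", l1_cache), ("L2", l2_cache), ("L3", l3_cache)):
                if address in level:
                    return f"{name} hit: {level[address]}"
            # Not cached anywhere, so make the (much slower) trip to RAM.
            return f"RAM: {ram[address]}"

        print(read(0x20))   # L2 hit: b
        print(read(0x40))   # RAM: d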

    The original Pentium processor had 16 KB of L1 cache. (The L2 cache was on the motherboard, so the amount varied depending on the board.) In contrast, today's Intel Core i7 processors can have up to 12 MB of cache. (A megabyte, abbreviated MB, is approximately one million bytes; a byte is eight bits, and a bit is a binary digit, either 0 or 1. You'll learn more about these measurements later in the chapter.) When processor manufacturers list cache sizes in their marketing materials, they typically list only the largest cache (that is, the highest numbered one). On multicore systems, the largest cache is shared among all the cores.

    Multicore Processors

    To keep making better and faster processors every year, manufacturers constantly have to find ways to increase the number of instructions per second. One way they have done this is to increase the complexity of the processor design. Another is to house multiple processors together and have them work in unison.

    For many years, motherboards have been available that hold multiple, separate processors, each with its own slot on the motherboard. This technology is limited, though, because of the distance between the processors and the speed limitations on the bus pathways between them. As mentioned earlier, manufacturers found a solution to this by creating multicore processors that combine several processors into a single package. These processors look like one chip from the outside but actually have several separate processor cores inside.

    NOTE

    Motherboards are still available that hold more than one processor, especially boards designed for servers. By combining multiple multicore processors in a system, you can increase the processing power many times over. For example, if a single-core processor runs at 3 GHz, a quad-core processor could offer a combined potential of 12 GHz, and a board with four quad-core processors could theoretically have a processing potential equivalent to 48 GHz. Be advised, however, that coordinating work among multiple processors involves overhead, so there is an inevitable loss in performance.
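
    To make the note's arithmetic concrete (the 90 percent scaling efficiency below is an arbitrary assumption for illustration, not a figure from the book), the theoretical aggregate is a simple multiplication, while realistic throughput scales somewhat less than linearly:

        # Illustration of the note above. The 0.90 efficiency factor is an
        # arbitrary assumption, not a figure from the book.
        single_core_ghz = 3.0
        cores = 4

        theoretical_ghz = single_core_ghz * cores         # 12.0 "GHz equivalent"
        realistic_ghz = single_core_ghz * cores * 0.90    # ~10.8 after coordination overhead

        print(theoretical_ghz, realistic_ghz)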

    Cooling Fans and Heat Sinks

    With millions of transistors working in close proximity, processors generate heat. This is a result of one of the simpler laws of physics involving electrical energy: as a transistor restricts the flow of electrical current, the energy has to go somewhere, so it's released as heat. Heat generation is highest when a processor is busy, such as when it's drawing a complex graphic or juggling many applications at once.

    Without external cooling, a processor would overheat and shut down almost immediately after the system started up. Therefore, all modern computers have some type of heat-displacement system. A cooling system can be either passive or active. A passive cooling system contains no moving parts; an active cooling system circulates air or water to carry the heat away.

    The most common type of passive cooling system is a heat sink. A heat sink is a block of heat-conductive metal that touches the processor and draws the heat away from it. Some small, less powerful processors, such as the one in a cell phone or hand-held gaming device, can stay cool enough to function with only a passive heat sink. Figure 1.4 shows a passive heat sink that might be used to cool one of the nonprocessor chips on a motherboard, such as the main chip in the chipset that controls motherboard operations.

    In today's PCs, passive cooling usually isn't enough for the processor to stay cool; a heat sink is combined with a fan that blows across it, further helping to dissipate the heat as it's drawn off. Figure 1.5 shows an example of an active heat sink.

    Active cooling systems require power to operate, increasing the amount of power the PC consumes. Some computers have energy-efficient cooling fans that can be adjusted in the BIOS setup to operate only if the processor's temperature (as measured by an internal sensor) rises above a certain value.

    (Continues...)

    Excerpted from "CompTIA Strata Study Guide Authorized Courseware: Exams FC0-U41, FC0-U11, and FC0-U21" by Andrew Smith. Copyright © Andrew Smith. Excerpted by permission. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher. Excerpts are provided solely for the personal use of visitors to this web site.