I. Introduction
   A. Analog Signal
   B. Digital Signal
II. Mixed Signal Design
   A. Data Converters
      i. Nyquist-Rate Converters
      ii. Sigma Delta Modulators
      iii. Flash A/D
   B. Switched Capacitor Circuits
      i. Resistor Equivalent
      ii. Stray Insensitive Integrators
      iii. Biquad Design
      iv. Leapfrog Design
   C. Layout Considerations
III. Difficulties of Mixed-Signal Systems
IV. Solutions
V. Conclusion
I. Introduction

Analog Signal
An analog signal is any variable signal that is continuous in both time and amplitude. It differs from a digital signal in that small fluctuations in the signal are meaningful. Analog is usually thought of in an electrical context; however, mechanical, pneumatic, hydraulic, and other systems may also convey analog signals. An analog signal uses some property of the medium to convey the signal's information. For example, an aneroid barometer uses rotary position as the signal to convey pressure information. Electrically, the property most commonly used is voltage, followed closely by frequency, current, and charge. Any information may be conveyed by an analog signal; often such a signal is a measured response to changes in physical phenomena, such as sound, light, temperature, position, or pressure, and is obtained using a transducer.
Digital Signal
Digital signals are digital representations of discrete-time signals, which are often derived from analog signals. A discrete-time signal is a sampled version of an analog signal: the value of the datum is noted at fixed intervals (for example, every microsecond) rather than continuously. If the individual time values of the discrete-time signal, instead of being measured precisely (which would require an infinite number of digits), are approximated to a certain precision (which therefore requires only a specific number of digits), then the resultant data stream is termed a digital signal. The process of approximating the precise value within a fixed number of digits, or bits, is called quantization.
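To make the idea of sampling and quantization concrete, the short Python sketch below maps samples of a continuous waveform onto a fixed number of discrete codes. It is only an illustration of the concept described above; the test signal, bit width, and voltage range are assumptions chosen for the example.

import math

def quantize(sample, n_bits, v_min=-1.0, v_max=1.0):
    # Map a continuous sample onto one of 2**n_bits discrete levels.
    levels = 2 ** n_bits
    step = (v_max - v_min) / levels          # quantization step size (one LSB)
    code = int((sample - v_min) / step)      # index of the nearest lower level
    return max(0, min(levels - 1, code))     # clamp to the available codes

# Sample a 1 kHz sine wave once every microsecond and quantize to 8 bits.
fs = 1_000_000
codes = [quantize(math.sin(2 * math.pi * 1000 * n / fs), 8) for n in range(10)]
print(codes)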
De La Salle University-Manila
College of Engineering
Electronics and Communications Engineering

Research Paper Title: Mixed Signal Design
Course Code/Section: VLSILEC
Lab Instructor: Engr. Roderick Yap
Group Number: 1
Submitted By: Ching, Warren; Austero, Kris; Santos, Marvin; So, Jonathan
II. Mixed-Signal Design
Mixed-signal ICs continue to develop. Continual improvements in process technology and circuit techniques are motivating designers to adopt 0.25-micron technology, and finer CMOS design rules dictate that geometries will become even smaller (Bindra, 2002).
Data Converters
Since the real world is analog, we need devices that can convert analog signals into digital signals and vice versa. An analog-to-digital (A/D) converter is an electronic circuit that accepts an analog input signal and produces a corresponding digital number at the output. A digital-to-analog (D/A) converter is an electronic circuit that accepts a digital number as its input and produces a corresponding analog signal at the output (Pett, 2002). Listed below are some of the commonly used A/D and D/A converters.
Figure: A/D and D/A converters
ADC

Successive Approximation
This circuit is derived from another ADC called the digital-ramp converter. The successive-approximation ADC was developed to overcome the digital-ramp ADC's shortcomings. In the digital-ramp converter, the register is only a counter that counts up in binary sequence. The successive-approximation register (SAR), by contrast, tries each bit value in turn, starting with the MSB and finishing at the LSB. Throughout this process, the register monitors the comparator's output to see whether the binary count is less than or greater than the analog input, adjusting the bit values accordingly. This strategy is much faster than the counting technique of the digital-ramp ADC.
Figure: Successive Approximation ADC
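As a rough illustration of the bit-by-bit search described above, the following Python sketch models an ideal SAR conversion. The input voltage, reference voltage, and resolution are arbitrary values assumed for the example.

def sar_adc(v_in, v_ref, n_bits):
    # Successive approximation: try each bit from MSB to LSB and keep it
    # only if the internal DAC output stays at or below the input voltage.
    code = 0
    for bit in range(n_bits - 1, -1, -1):
        trial = code | (1 << bit)               # tentatively set this bit
        v_dac = v_ref * trial / (2 ** n_bits)   # DAC voltage for the trial code
        if v_in >= v_dac:                       # comparator decision
            code = trial
    return code

print(sar_adc(3.1, 5.0, 8))  # 158: an 8-bit result after only 8 comparisons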
Flash
Also called the parallel ADC, it is formed of a bank of comparators, each one comparing the input signal to a unique reference voltage. The comparator outputs connect to the inputs of a priority encoder circuit, which then produces a binary output.
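A simple way to picture this is the sketch below, which models the comparator bank as a thermometer code and the priority encoder as a count of the comparators that tripped. The reference ladder and resolution are assumed for illustration only.

def flash_adc(v_in, v_ref, n_bits):
    # One comparator per reference tap: output 1 if the input exceeds that tap voltage.
    taps = 2 ** n_bits - 1
    thermometer = [v_in > v_ref * (k + 1) / (2 ** n_bits) for k in range(taps)]
    # The priority encoder reduces the thermometer code to a binary count.
    return sum(thermometer)

print(flash_adc(3.1, 5.0, 3))  # 4, since 3.1 V lies between 4/8 and 5/8 of 5 V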
Figure: Flash ADC

Delta-Sigma
One of the most advanced ADC architectures is the Delta-Sigma ADC. Delta, in Greek, means difference or change, while Sigma denotes summation in mathematics. It is sometimes also referred to as a Sigma-Delta ADC. The principle of the sigma-delta architecture is to make rough evaluations of the signal, measure the error, integrate it, and then compensate for that error. The mean output value is then equal to the mean input value if the integral of the error is finite (Wikipedia).
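The behavioral Python sketch below follows that description for a first-order modulator: the loop accumulates the difference between the input and a 1-bit feedback DAC, so the density of ones in the output stream tracks the input. The input waveform and oversampling ratio are assumptions made only for this example.

import math

def sigma_delta(samples):
    # First-order modulator: integrate the error between the input and the 1-bit feedback.
    integrator = 0.0
    bits = []
    for x in samples:                                # input assumed within [-1, 1]
        feedback = 1.0 if integrator >= 0 else -1.0  # 1-bit DAC in the feedback path
        integrator += x - feedback                   # delta (difference), then sigma (sum)
        bits.append(1 if feedback > 0 else 0)
    return bits

# Heavily oversample a slow sine wave; averaging the bit stream recovers its mean.
samples = [0.5 * math.sin(2 * math.pi * n / 256) for n in range(1024)]
stream = sigma_delta(samples)
print(sum(stream) / len(stream))  # close to 0.5, the bit density for a zero-mean input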
Figure: Delta-Sigma ADC

DAC

Binary Weighted Resistor
This circuit is a variation of the inverting summing op-amp circuit. The inverting op-amp uses negative feedback for controlled gain and has several inputs and one output. The output is the negative of the sum of all the inputs. The number of inputs corresponds to the number of bits to be converted.
Figure: Binary Weighted Resistor DAC

R/2R Ladder
This circuit is an alternative to the binary-weighted resistor DAC and is much more widely used. It requires fewer unique resistor values, which removes the main disadvantage of the previous circuit. By constructing a different kind of resistor network on the input of the summing circuit, we can achieve the same kind of binary weighting with only two resistor values, and with only a modest increase in resistor count.
Figure: R/2R Ladder DAC
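As a sketch of the two weighting schemes just described, the Python snippet below computes the output of an ideal binary-weighted inverting summer and of an ideal R/2R ladder. The resistor values, reference voltage, and input code are assumptions chosen for the example; real circuits would also have to account for component tolerances.

def binary_weighted_dac(code, n_bits, v_ref, r_f=1.0, r_msb=1.0):
    # Inverting summer: each bit drives a resistor of value R, 2R, 4R, ... into the summing node.
    v_out = 0.0
    for i in range(n_bits):
        bit = (code >> (n_bits - 1 - i)) & 1
        r_i = r_msb * (2 ** i)                  # the MSB sees the smallest resistor
        v_out += -(r_f / r_i) * (bit * v_ref)   # output is the negative weighted sum
    return v_out

def r2r_dac(code, n_bits, v_ref):
    # R/2R ladder: walking from LSB to MSB, each stage halves the accumulated voltage.
    v = 0.0
    for i in range(n_bits):                     # i = 0 is the LSB
        bit = (code >> i) & 1
        v = (v + bit * v_ref) / 2.0
    return v                                    # equals v_ref * code / 2**n_bits

print(binary_weighted_dac(0b1010, 4, 5.0))      # -6.25 V (inverting topology)
print(r2r_dac(0b1010, 4, 5.0))                  # 3.125 V = 5 V * 10/16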
The Switched Capacitor
If S2 closes with S1 open, and then S1 closes with S2 open, a charge Δq is transferred from v2 to v1, with Δq = C(v2 − v1). If this switching process is repeated N times in a time Δt, the amount of charge transferred per unit time is given by Δq·(N/Δt) = C(v2 − v1)·(N/Δt). Recognizing that the left-hand side represents charge per unit time, or current, and that the number of cycles per unit time is the switching frequency (or clock frequency, fCLK), we can rewrite the equation as I = C(v2 − v1)·fCLK. Rearranging, we get Req = (v2 − v1)/I = 1/(C·fCLK), which states that the switched capacitor is equivalent to a resistor. The value of this resistor decreases with increasing switching frequency or increasing capacitance, as either will increase the amount of charge transferred from v2 to v1 in a given time.
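The relationship Req = 1/(C·fCLK) is easy to check numerically; the capacitor value and clock frequency in the sketch below are assumptions chosen only to give round numbers.

def switched_cap_resistance(c, f_clk):
    # Equivalent resistance of a switched capacitor: R_eq = 1 / (C * f_clk)
    return 1.0 / (c * f_clk)

# A 1 pF capacitor switched at 100 kHz behaves like a 10 Mohm resistor.
print(switched_cap_resistance(1e-12, 100e3))  # 1e7 ohms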
Stray Insensitive Integrator
Biquad Circuit & Block Diagrams
Layout Considerations

Development of System Architects
The primary barrier to widespread adoption of a top-down design style for complex mixed-signal circuits is a lack of engineers with the skills and training to be system architects. A system architect must:
• Be fluent in an AMS language and skilled in the art of modeling
• Be an experienced designer
• Have a good understanding of the top-down design process
• Be proficient in the use of circuit and AMS simulation
• Have the ability to lead and manage complex projects
Given the high-pressure world that most designers live in, it is unlikely that they will be able to acquire such a broad and deep set of skills while on the job, even if they are motivated to do so. Rather, it is important for their employers to look for engineers who have the interest and the relevant background and to invest the time and training to develop them into system architects. In addition, it is essential that appropriate training become available from universities and continuing education centers.
Top-Down Design of Mixed-Signal Circuits
With mixed-signal designs becoming more complex and time-to-market windows shrinking, designers cannot hope to keep up unless they change the way they design. They must adopt a more formal process for design and verification: top-down design. It involves more than simply a cursory design of the circuit block diagram before designing the blocks. Rather, it also involves developing and following a formal verification plan and an incremental and methodical approach for transforming the design from an abstract block diagram to a detailed transistor-level implementation.
At the Design Automation Conference in 1998, Ron Collett of Collett International presented findings from a 1997 productivity study in which his firm analyzed 21 chip designs from 14 leading semiconductor firms. The study revealed a productivity gap of 14× between the most and least productive design teams. The study also revealed that developing analog and mixed-signal circuitry requires three to seven times more effort per transistor than designing digital control logic, though this factor was normalized out of the 14× ratio. The primary culprits behind the poor productivity of those at the bottom of the scale are increasingly complex designs, combined with a continued preference for a bottom-up (i.e., transistor-level) design methodology and simulation occurring late in the design cycle, which leads to errors and re-spins. There is a huge disparity in productivity between those mixed-signal designers who have transitioned to a "top-down" design methodology and use mixed-signal hardware description languages (MS-HDLs), and those who practice "bottom-up" design and rely solely on SPICE.
Moore's observation that the number of transistors available on an integrated circuit doubles every 18 to 24 months continues to hold. Competitive pressures compel designers to use these transistors either to provide additional functionality or to increase the integration level, thereby decreasing the size, weight, power, and cost of the product. As a result, designers are confronted with larger and more complex designs. The increasing size and complexity of these designs combines with the shrinking time available to develop them and get them to market, making the job of the circuit designer today much more difficult than in the past.

Circuits are getting more complex in two different ways at the same time. First, circuits are becoming larger. Consider wireless products: 30 years ago a typical receiver contained between 5 and 10 transistors, whereas it is common for a modern cell phone to contain 10 million transistors. Second, the operation of the circuits is becoming more complex. Thirty years ago, integrated circuits generally consisted of simple functional blocks such as op-amps and gates, and verification typically required simulating a block for two or three cycles. Today, mixed-signal chips implement complex algorithms that require designers to examine their operation over thousands of cycles. Examples include PLLs, sigma-delta converters, magnetic storage PRML channels, and CDMA transceivers. The result of these two effects together is that complexity is increasing at a blistering pace and is outstripping the designers' ability to keep up. For example, 30 years ago you could build a radio from a rock and a wire, whereas building a modern radio requires more compute power than existed in the largest supercomputer available back then.

The CAD tools and computers employed by designers continually improve, which serves to increase the productivity of designers. However, the rate of productivity increase is not sufficient to allow designers to keep up with the increasing complexity of designs and decreasing time-to-market requirements. The growing difference between the improvement in productivity needed to satisfy the demands of the market and the productivity available simply by using the latest CAD tools and computers is referred to as the design productivity gap. To close this gap, one must change the way design is done. A design style is needed that reduces the number of serial steps, increases the likelihood of first-time working silicon, and increases the number of designers who can work together effectively. If a design group fails to move to such a design style, it will become increasingly ineffective. It will eventually be unable to get products to market while they are still relevant and so will be forced out of the market.
i. Bottom-Up Design
The traditional approach to design is referred to as bottom-up design. In it, the design process starts with the design of the individual blocks, which are then combined to form the system. The design of the blocks starts with a set of specifications and ends with a transistor-level implementation. At this point, each block is verified as a stand-alone unit against its specifications and not in the context of the overall system. Once verified individually, the blocks are then combined and verified together, but at this point the entire system is represented at the transistor level. While the bottom-up design style continues to be effective for small designs, large designs expose several important problems in this approach:
• Once the blocks are combined, simulation takes a long time and verification becomes difficult and perhaps impossible. The amount of verification must be reduced to meet time and compute constraints. Inadequate verification may cause projects to be delayed because of the need for extra silicon prototypes.
• For complex designs, the greatest impact on performance, cost, and functionality is typically found at the architectural level. With a bottom-up design style, little if any architectural exploration is performed, and so these types of improvements are often missed.
• Any errors or problems found when assembling the system are expensive to fix because they involve redesign of the blocks.
• Communication between designers is critical, yet an informal and error-prone approach to communication is employed. In order to assure that the whole design works properly when the blocks are combined, the designers must be in close proximity and must communicate often. With the limited ability to verify the system, any failure in communication could result in the need for additional silicon prototypes.
• Several important and expensive steps in the bottom-up design process must be performed serially, which stretches the time required to complete the design. Examples include system-level verification and test development.
The number of designers that can be used effectively in a bottom-up design process is limited by the need for intensive communication between the designers and the inherently serial nature of several of the steps. The communication requirements also tend to require that designers be co-located.
ii. Top-Down Design
In order to address these challenges, many design teams are either looking to, or have already implemented, a top-down design methodology. In a basic top-down approach, the architecture of the chip is defined as a block diagram and is simulated and optimized using either an MS-HDL simulator or a system simulator. From the high-level simulation, requirements for the individual circuit blocks are derived. Circuits are then designed individually to meet these specifications. Finally, the entire chip is laid out and verified against the original requirements. This represents the widely held view of what top-down design is. And while this is a step towards top-down design, it only addresses one of the issues with bottom-up design.
III. Difficulties of Mixed-Signal Systems
Some of the challenges in filling the design gap in the mixed-signal area are:
• Availability of virtual components that guarantee functionality, performance, and production yield within short time delays
• Decreasing power supply voltages
• Signal integrity: noise from the power supply, interconnections, and the substrate
• Industrial testability
• Migration of the design towards various processes while introducing new features or modifying performance
• Mixed-signal and multilevel simulations
• MOS matching versus technology size
• ROM yield versus technology size
IV. Solutions

About DesignWare Mixed-Signal IP
Synopsys enables designers to quickly integrate analog Mixed-Signal IP (MSIP) into next-generation systems-on-chip (SoCs) with a comprehensive portfolio of high-performance PHYs for the PCI Express, SATA, XAUI, and USB protocols. In addition, the MSIP offering also includes a complete suite of I/O libraries. Available for industry-leading processes, DesignWare Mixed-Signal IP meets the needs of today's high-speed designs for the networking, storage, computing, and consumer electronics markets. The DesignWare MSIP offering is complemented by a comprehensive suite of digital controller cores and verification IP to provide chip developers with a complete solution for SoC integration. Each MSIP can be licensed individually on a fee-per-project basis, or customers can opt for the Volume Purchase Agreement, which enables them to license all the MSIP in one simple agreement.

All these issues, specific to the analog and mixed-signal world, add the following new challenges to the usual SoC design flow:
• To guarantee functionality, performance, and production yield within short time delays
• To cope with decreasing power supply voltages
• To evaluate the impact of noise on analog performance
• To take industrial testability into account
• To be able to quickly migrate a design towards various processes while introducing new features or modifying its performance
• To be capable of doing efficient mixed-signal simulations all along the design process
All these requirements have to be fulfilled with a high level of quality, implying bug-free, reliable solutions and high production yields.
In general, a test engineer has four things to optimize: (1) test cost, (2) yield, (3) time to volume/time to market, and (4) defective parts per million (DPPM) of shipped devices. The relative importance of each of these goals depends on the market segment, the maturity of the product, and a host of other factors. Test cost is minimized by optimizing test time and by choosing low-cost test platforms. There are also other factors involved in test cost, including burn-in requirements and the need for testing at multiple temperatures. Optimization of yield and DPPM often involves improving test reliability to minimize guardbands, as well as ensuring high test coverage. Time-to-volume constraints, often imposed by market requirements, put severe pressure on test engineers to develop reliable, low-cost test solutions in minimum time.

A large percentage of next-generation ASICs will have significant analog or RF content. The challenges in mixed-signal test include testing of (1) high-speed serial interfaces, (2) higher-performance analog channels and data converters, and (3) ultra-wideband (UWB) radios. Parallel low-speed interfaces in ASICs are increasingly being replaced with high-speed serial interfaces, such as PCI Express, RapidIO, and USB 2.0. The specifications of these high-speed interfaces include measurement of analog parameters, such as rise/fall time, eye opening, receiver sensitivity, and receiver jitter tolerance. These high-speed interfaces challenge the performance of even the highest-performance Automatic Test Equipment (ATE). However, the test cost constraints on many of these ASICs mean that these parts have to be tested on low-cost testers.
V. Conclusion
Mixed-signal design is a formal design process that requires a serious commitment throughout the entire design process. It is not a piece of software or something you do in your spare time. It is not a way to reduce headcount or something you can try after the design is complete. It is considerably more than simply doing the initial design of the block diagram with Simulink, and it is not something you can be successful at without a significant investment in time and training. However, it is much easier the second time around and, once mastered, provides a dramatic return. Using top-down design usually results in fewer design iterations, which makes the design process more predictable. It also results in more optimal designs in a shorter time. Finally, it allows design teams to be larger and more dispersed, giving the option of trading a higher initial investment for a shorter time to production.
References:
Bindra, Ashok. (January 2002). Analog/Mixed-Signal ICs: Introduction/Analog-to-Digital Converter. http://www.elecdesign.com/Articles/Index.cfm?AD=1&ArticleID=1424
Johns, David and Martin, Ken. University of Toronto.
All About Circuits. http://www.allaboutcircuits.com/vol_4/chpt_13/index.html
Pett, J. G. (2002). AD/DA Conversion Techniques: An Overview.
Wikipedia.