Lab 2 - EE 421L 

Authored by: Adam James Wolverton

Email: Wolvert9@unlv.nevada.edu

Date: September 5, 2013

  


Lab Description:

 This second laboratory experiment will go through the design of a 10-bit digital-to-analog converter (DAC). The lab report consists of:

    1) Provide a narrative of the steps performed in the prelab.

    2) Provide and discuss simulation results different from those in the prelab to illustrate your understanding of the ADC and DAC.

    3) Explain how you determine the least significant bit (LSB, the minimum voltage change on the ADC's input to see a change in the digital code B[9:0]) of the converter. Use simulations to support your understanding.


Prelab Work with Narrative:

1) I downloaded the lab2.jelib file and opened the library in Electric, then used Cross-Library Copy to copy the ADC/DAC schematic cells into the ee421_ecg621.jelib library.

    

 

    


2) Next, I saved the course/lab jelib so that it now contains the copied cells of the ideal 10-bit ADC and DAC.

     


    


3) After running the simulation, I get the following waveforms for the input and output. B[9:0] is the 10-bit digital encoding of the analog input Vin, and Vout is the DAC's reconstruction of that code back into an analog voltage.
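As a sanity check on what B[9:0] represents, here is a minimal Python sketch of an ideal 10-bit ADC's encoding. This is my own illustrative model, not the Electric netlist; the 0 V and 5 V references are assumed from the lab's 5 V supply.

    # Illustrative model of an ideal 10-bit ADC (not the simulated netlist).
    N = 10                        # resolution in bits
    VREFP, VREFM = 5.0, 0.0       # assumed reference voltages
    LSB = (VREFP - VREFM) / 2**N  # one step of the converter

    def adc(vin):
        """Quantize an analog input voltage to the 10-bit code B[9:0]."""
        code = int((vin - VREFM) / LSB)     # floor to the step below vin
        return max(0, min(2**N - 1, code))  # clamp to the valid code range

    print(adc(2.5), format(adc(2.5), '010b'))  # mid-scale -> 512 1000000000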

     


    


Understanding of ADC and DAC:

    

The original signal is a clean sinusoid with a 5 V peak-to-peak amplitude, as seen below. This is the signal prior to entering the ADC.

    

                

   

The signal is then converted to a digital code, shown on 10 separate plot panes below. As the number of bits increases, the resolution of the conversion improves, as the short calculation below shows.
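A quick way to quantify this is to compute the step size (1 LSB) at a few resolutions; a short sketch, again assuming a 0 to 5 V reference span:

    # The step size shrinks as the number of bits grows, so the
    # reconstruction tracks the analog input more finely.
    VREF = 5.0  # assumed reference span (VREF+ - VREF-)
    for n in (8, 10, 12):
        print(f"{n:>2}-bit: 1 LSB = {VREF / 2**n * 1e3:6.2f} mV")
    # ->  8-bit: 19.53 mV, 10-bit: 4.88 mV, 12-bit: 1.22 mV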

        

   

       

Those 10 bits then drive the DAC, which converts the digital code back into an analog signal. However, some precision is lost, as can be seen by comparing Vin and Vout in the simulation shown below.
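That precision loss can be bounded with the same illustrative model: for an ideal converter pair, the round-trip error never exceeds 1 LSB. A sketch, assuming 0 to 5 V references and floor quantization:

    import math

    N, VREF = 10, 5.0
    LSB = VREF / 2**N

    def adc(vin):                 # ideal quantizer, as above
        return max(0, min(2**N - 1, int(vin / LSB)))

    def dac(code):                # ideal reconstruction
        return code * LSB

    # Round-trip a 5 V peak-to-peak sinusoid; track the worst-case error.
    worst = 0.0
    for k in range(1000):
        vin = 2.5 + 2.5 * math.sin(2 * math.pi * k / 1000)
        worst = max(worst, abs(vin - dac(adc(vin))))
    print(f"worst |Vin - Vout| = {worst*1e3:.2f} mV (<= 1 LSB = {LSB*1e3:.2f} mV)")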

    

        

   

Determining the Least Significant Bit

To determine the LSB, take the positive reference voltage, subtract the negative reference voltage, and divide by 2^N, where N is the number of bits: 1 LSB = (VREF+ - VREF-) / 2^N. Since we have 10-bit converters and a 0 to 5 V reference range, 1 LSB = 5 V / 1024 ≈ 4.88 mV.
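A small check of that arithmetic, and of the "minimum input change" interpretation, using the same illustrative quantizer; the 4 mV and 5 mV probe steps are arbitrary values chosen to straddle 1 LSB:

    N, VREF = 10, 5.0
    LSB = VREF / 2**N                    # (VREF+ - VREF-) / 2^N
    print(f"1 LSB = {LSB*1e3:.4f} mV")   # -> 4.8828 mV

    def adc(vin):                        # ideal quantizer, as before
        return max(0, min(2**N - 1, int(vin / LSB)))

    # B[9:0] only changes once the input moves by about 1 LSB.
    print(adc(2.500), adc(2.504), adc(2.505))   # -> 512 512 513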

    


    

Lab Report:

1)    Using the 5-bit resistor DAC topology given in Figure 30.14, expanded to 10 bits, I implemented my own DAC, as seen below; all of the resistor values are 10 kΩ. After the resistor schematic was created, I placed an instance of my DAC cell into the ADC-to-DAC circuit to test the output. DRC checked out without errors.
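As a cross-check of the expected transfer function, below is a Thevenin-reduction model of the ladder in Python. It assumes the Figure 30.14 topology is the voltage-mode R-2R ladder, with each 2R leg built from two series 10 kΩ resistors; this is my own sketch, not the simulated netlist.

    # Thevenin reduction of a voltage-mode R-2R ladder (assumed topology).
    R, VDD, N = 10e3, 5.0, 10

    def r2r_dac(code):
        """Return (Vout, Rout) for an N-bit code driving the ladder."""
        bits = [(code >> i) & 1 for i in range(N)]   # bits[0] is the LSB
        vth, rth = 0.0, 2 * R                        # terminating 2R to ground
        for i, b in enumerate(bits):
            # 2R leg switched to VDD (bit = 1) or ground (bit = 0)
            vth = (vth / rth + b * VDD / (2 * R)) / (1 / rth + 1 / (2 * R))
            rth = rth * 2 * R / (rth + 2 * R)
            if i < N - 1:
                rth += R                             # series R to the next node
        return vth, rth

    vout, rout = r2r_dac(512)                        # mid-scale code
    print(f"Vout = {vout:.4f} V, Rout = {rout/1e3:.1f} kOhm")  # 2.5 V, 10 kOhm

For the mid-scale code 512 this returns Vout = 2.5 V with a 10 kΩ output resistance, matching the ideal Vout = VDD·code/2^N relation.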

    

        

2)    How do we determine the output resistance of the DAC? Ground all of the digital inputs (an ideal source adds no resistance) and combine the resistors in series and parallel, collapsing the ladder one node at a time until only a single resistance R remains at the output. This works cleanly because all of the resistors used in this design are built from the same 10 kΩ unit, giving an output resistance of 10 kΩ.
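The same series/parallel reduction can be written as a loop that collapses the ladder one node at a time (same R-2R assumption as above, with every input grounded):

    # 2R || 2R = R at each node; the series R restores 2R for the next node.
    R, N = 10e3, 10
    rth = 2 * R                              # terminating 2R resistor
    for i in range(N):
        rth = rth * 2 * R / (rth + 2 * R)    # parallel with the grounded 2R leg
        if i < N - 1:
            rth += R                         # series R toward the output
    print(f"Rout = {rth/1e3:.1f} kOhm")      # -> 10.0 kOhm, i.e. Rout = R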

    

3)    Delay while driving a load: with the DAC's 10 kΩ output resistance driving a 10 pF load, the expected time constant is RC = 10 kΩ × 10 pF = 100 ns, which predicts a 50%-point delay of roughly 0.7·RC = 70 ns; the simulation results show a delay of about 80 ns, in reasonable agreement.
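The corresponding hand calculation, as a short sketch (the 0.7·RC rule of thumb is the 50%-point delay of a first-order RC step response):

    # Delay estimate for the DAC driving a capacitive load through Rout.
    R, C = 10e3, 10e-12                # Rout from part 2, 10 pF load
    tau = R * C                        # time constant
    print(f"tau = {tau*1e9:.0f} ns, 50%-point delay ~ {0.7*tau*1e9:.0f} ns")
    # -> tau = 100 ns, delay ~ 70 ns (the simulation showed roughly 80 ns)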

    

    

4)    The following waveforms show how the DAC behaves with a capacitive load, a resistive load, and an RC load, in that order.

    

Notice the phase shift and the smoothing of the stair-stepped signal caused by the capacitive load, which forms a low-pass filter with the DAC's output resistance.

         

This waveform shows the resistive load: the output voltage drops because the load forms a divider with the DAC's output resistance, but the signal still looks digitized and remains in phase.

        

This waveform shows the RC load: the output is attenuated by the resistor, smoothed by the capacitor, and phase shifted. In conclusion, we see that the non-ideal DAC does not work well under any of these loads.
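To put rough numbers on these loading effects, here is a back-of-the-envelope sketch; the 10 kΩ resistive load and 10 pF capacitive load are hypothetical values chosen for illustration, not necessarily the ones used in the simulations above.

    import math

    Rout = 10e3              # DAC output resistance from part 2
    RL, CL = 10e3, 10e-12    # hypothetical load values, illustration only

    # Resistive load: a divider with Rout explains the amplitude drop.
    print(f"divider gain = {RL / (RL + Rout):.2f}")                 # -> 0.50

    # Capacitive load: a first-order low-pass explains the smoothing.
    print(f"f3dB = {1 / (2 * math.pi * Rout * CL) / 1e6:.2f} MHz")  # -> 1.59 MHz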

        

5) When the switches in the DAC are implemented with MOSFETs, we want the on-resistance of those MOSFETs to be small compared to the ladder's resistor values, because the added voltage drops and time delay would make the accuracy and reliability of the DAC questionable.
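For a rough feel of what "small" means here, below is a sketch of a common matching argument; the switch resistances are assumed values, and the 1/2^N budget is a first-order estimate for keeping the MSB-leg error under half an LSB.

    # A resistance error in the MSB leg appears almost directly as a
    # full-scale gain error, so for < 1/2 LSB of error the fractional
    # error should stay below roughly 1 / 2^N.
    R, N = 10e3, 10
    budget = 1 / 2**N                           # ~0.098% for 10 bits
    for Rsw in (5, 50, 500):                    # assumed switch on-resistances
        mismatch = Rsw / (2 * R)                # fraction added to a 2R leg
        verdict = "OK" if mismatch <= budget else "too large"
        print(f"Rsw = {Rsw:>3} ohm: leg error = {mismatch:.3%} "
              f"(budget {budget:.3%}) -> {verdict}")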

    

    

    

   

Return to Adam's EE 421L Labs