Lab 2 - EE 421L
Email: delape19@unlv.nevada.edu
Pre-Lab Work
1) After downloading the lab2 library, using the Cross-Library Copy menu command will allow you to obtain the ideal 10-bit ADC and DAC. The following is what your library will look like after the copy.
2) Select the "sim_ADC_DAC" schematic cell.
3) Run the simulation and plot the Vin and Vout traces
Simulation Results
From the above simulation, the input voltage (Vin) appears as a smooth waveform, while the output voltage (Vout) appears as a jagged, stair-stepped waveform. This change illustrates the basic operation of the ADC and DAC.
The least significant bit (LSB) of an ADC is the minimum change in input voltage needed to change the output code. Equivalently, 1 LSB is the smallest change in the DAC's output voltage. The number of bits in the converter, n, determines the LSB: 1 LSB = VDD / 2^n.
In this circuit:
The time delay = 0.7RC = 0.7 × 10 kΩ × 10 pF = 70 ns.
1 LSB = VDD / 2^n = 5 V / 2^10, or about 4.88 mV.
Since the pulse is connected at the top (the most significant bit), the digital code is 10 0000 0000, or 512.
Vout = Vin/2, or 2.5 V.
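The hand calculations above can be checked with a short script. This is a sketch, not part of the lab files; the component values (R = 10 kΩ, C = 10 pF, VDD = 5 V, n = 10 bits) are the ones used in the calculations above.

```python
# Verifying the hand calculations for the ideal 10-bit DAC.
# Assumed values from the lab: R = 10 kOhm, C = 10 pF, VDD = 5 V, n = 10.

R = 10e3      # ladder resistance, ohms
C = 10e-12    # load capacitance, farads
VDD = 5.0     # supply voltage, volts
n = 10        # converter resolution, bits

delay = 0.7 * R * C        # RC time-delay estimate
lsb = VDD / 2**n           # 1 LSB step size
code = 0b1000000000        # MSB high, all other bits low
vout = VDD * code / 2**n   # ideal DAC output for that code

print(f"delay = {delay * 1e9:.0f} ns")   # 70 ns
print(f"1 LSB = {lsb * 1e3:.2f} mV")     # 4.88 mV
print(f"code  = {code}")                 # 512
print(f"Vout  = {vout:.2f} V")           # 2.50 V
```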
The simulation results are as follows:
4)
In a real circuit, transistors (MOSFETs) are used in place of ideal switches
at the outputs of the ADC. Since the on-resistance of the MOSFETs is not
small compared to R, the output resistance of the DAC is affected, increasing the
time delay of the circuit and distorting the output waveform.
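The effect on the delay can be sketched numerically. This is a simplified model, not the lab's circuit: it assumes the switch's on-resistance Rsw (a value I'm choosing for illustration) simply adds in series with R.

```python
# Sketch: how a MOSFET's on-resistance changes the RC time delay.
# Assumed values: R = 10 kOhm, C = 10 pF; Rsw is a hypothetical
# switch on-resistance (an ideal switch has Rsw = 0).

R = 10e3      # ladder resistance, ohms
C = 10e-12    # load capacitance, farads

def delay(Rsw):
    # switch resistance modeled in series with the ladder resistance
    return 0.7 * (R + Rsw) * C

print(f"ideal switch : {delay(0) * 1e9:.0f} ns")    # 70 ns
print(f"Rsw = 5 kOhm : {delay(5e3) * 1e9:.0f} ns")  # 105 ns
```

As the sketch shows, a non-negligible switch resistance directly stretches the delay, which is why the real output waveform differs from the ideal one.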