This thesis investigates the effect of fiber dispersion and Multiple Access Interference (MAI) on the Bit Error Rate (BER) performance of a Direct Sequence Optical Code Division Multiple Access (DS-OCDMA) network employing intensity modulation and optical correlator receivers. Using MATLAB simulations, the Signal-to-Noise Ratio (SNR) versus Received Optical Power (ROP) of an OCDMA transmission system is evaluated with a 7-chip m-sequence spreading code for different numbers of simultaneous users. The BER versus ROP is then examined for various lengths of single-mode optical fiber, taking the dispersion in the fiber into account. Further MATLAB simulations illustrate the effect of the dispersion index gamma and explore design scenarios, e.g., how much transmitted power must be launched to maintain a BER of 10⁻⁹ as the fiber length increases.
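As an illustrative sketch only (not the thesis's actual MATLAB simulation), the 7-chip m-sequence used as a spreading code can be generated from a 3-stage linear feedback shift register. The feedback polynomial x³ + x + 1 and the initial register state chosen here are assumptions; any primitive degree-3 polynomial produces a 7-chip m-sequence. The check of the periodic autocorrelation (7 at zero shift, −1 at all other shifts for the ±1-mapped sequence) is what makes such codes attractive for correlator receivers in DS-OCDMA:

```python
def m_sequence(length=7):
    # 3-stage Fibonacci LFSR; feedback polynomial x^3 + x + 1 (assumed).
    # Initial state [1, 0, 0] is an arbitrary nonzero seed.
    state = [1, 0, 0]
    seq = []
    for _ in range(length):
        seq.append(state[-1])          # output the last register bit
        fb = state[2] ^ state[0]       # XOR of the tapped stages
        state = [fb] + state[:-1]      # shift the register
    return seq

def autocorr(seq, shift):
    # Periodic autocorrelation of the sequence mapped to +/-1 chips.
    s = [1 if c else -1 for c in seq]
    n = len(s)
    return sum(s[i] * s[(i + shift) % n] for i in range(n))

seq = m_sequence()
print("7-chip m-sequence:", seq)
print("autocorrelation:", [autocorr(seq, k) for k in range(7)])
```

The sequence is balanced (four ones, three zeros), and the flat −1 off-peak autocorrelation bounds the self-interference a correlator sees, which is one reason m-sequences limit MAI between users.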