Recent methods for inferring genetic networks identify gene interactions through similarities in expression profiles. These methods rest on the assumption that interacting genes share greater similarity in their expression profiles than non-interacting genes. In this dissertation we evaluate this assumption when mutual information is used as the similarity measure. Three algorithms that compute mutual information between expression profiles are developed: 1) a basic approach implemented with the histogram technique; 2) an extension of the basic approach that accounts for a time delay between expression profiles; 3) an extension of the basic approach that accounts for genes being regulated in a complex manner by multiple genes. In our experiments we compare the mutual information distributions for profiles of interacting and non-interacting genes. The results show that interacting genes do not share higher mutual information in their expression profiles than non-interacting genes, thus violating the basic assumption that a similarity measure must satisfy. This indicates that mutual information is not appropriate as a similarity measure, which contradicts earlier proposals.
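
As a rough illustration of the histogram technique and the time-delay extension mentioned above (the dissertation's own implementations may differ), the following minimal sketch estimates the mutual information between two expression profiles; the function name, bin count, and `delay` parameter are illustrative assumptions, not the author's code.

```python
import numpy as np

def mutual_information(x, y, bins=10, delay=0):
    """Histogram-based estimate of I(X;Y) between two expression profiles.

    A nonzero `delay` shifts y relative to x, a simple stand-in for the
    time-delay extension described in the abstract (assumed behavior).
    """
    if delay > 0:
        x, y = x[:-delay], y[delay:]
    # Joint histogram over the two profiles -> joint probability estimate
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    # Marginal distributions from the joint
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    # I(X;Y) = sum over nonzero cells of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    nonzero = p_xy > 0
    return np.sum(p_xy[nonzero] * np.log2(p_xy[nonzero] / (p_x @ p_y)[nonzero]))
```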