A method of processing optical coherence tomography (OCT) scans, comprising: receiving OCT data indicating the level of scattering in a sample, the data including the OCT signal for at least one scan through the sample; processing the OCT data for each scan to produce an indicative depth scan representing the OCT signal at each depth across all of the scans; fitting a curve, comprising a term which decays exponentially with depth and a second term which depends on the noise in the OCT signal, to the indicative depth scan; and calculating a compensated intensity of the OCT signal, for each point in each scan, as a ratio of a term comprising the logarithm of the OCT signal to a term comprising the logarithm of the fitted curve.

The indicative depth scan may be the mean A-scan, representing the mean OCT intensity versus depth across all the scans. The compensated intensity may be used to generate the pixels of an image. The position of the sample surface (e.g. skin) may be determined by detecting sudden changes in intensity. A roll-off factor may reduce the degree of compensation with increasing depth. Bilateral and despeckle filters may be applied to remove speckle grain from the image. The image may be coloured based on the difference between the compensated intensity and the fitted curve.
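On one reading of the core steps, the fitted curve has the form F(z) = A·exp(−μz) + N and the compensated intensity at each point is log I(x, z) / log F(z). The following is a minimal illustrative sketch in Python (not part of the original disclosure), assuming the OCT data arrives as a 2-D array of linear intensities with shape (depth, scans); the constant noise floor, the variable names, and the initial-guess heuristics are all assumptions:

```python
# Sketch of the core compensation steps. Assumptions (not from the source):
# oct_data is a 2-D array of linear intensities, shape (depth, n_scans);
# the noise term is modelled as a constant floor; initial guesses are ad hoc.
import numpy as np
from scipy.optimize import curve_fit

def decay_model(z, a, mu, noise):
    """Exponentially decaying term plus a constant noise-floor term."""
    return a * np.exp(-mu * z) + noise

def compensate(oct_data):
    depth = oct_data.shape[0]
    z = np.arange(depth, dtype=float)

    # Indicative depth scan: the mean A-scan across all lateral positions.
    mean_a_scan = oct_data.mean(axis=1)

    # Fit the two-term curve (exponential decay + noise floor) to it.
    p0 = (mean_a_scan.max(), 1.0 / depth, mean_a_scan.min())
    (a, mu, noise), _ = curve_fit(decay_model, z, mean_a_scan, p0=p0)
    fitted = decay_model(z, a, mu, noise)

    # Compensated intensity: log(signal) / log(fitted curve) at every point.
    eps = 1e-12  # guard against log(0)
    compensated = np.log(oct_data + eps) / np.log(fitted + eps)[:, None]
    return compensated, fitted
```

Fitting to the mean A-scan rather than to each individual A-scan keeps the fit robust to speckle in any single scan, which is presumably why an indicative depth scan is used at all.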
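The surface-detection and roll-off steps might be sketched as follows. Taking the surface as the steepest intensity rise in each A-scan is one way to detect "sudden changes in intensity", and tau is a hypothetical tuning constant, not a value from the source:

```python
import numpy as np

def find_surface(oct_data):
    # Depth index of the steepest intensity rise in each A-scan, taken
    # here as the position of the sample (e.g. skin) surface.
    return np.argmax(np.diff(oct_data, axis=0), axis=0)

def apply_rolloff(compensated, raw_log, tau=200.0):
    # Blend the compensated image back toward the raw log-intensity image
    # (raw_log, e.g. np.log(oct_data + eps)) as depth increases, so the
    # compensation rolls off rather than amplifying noise at depth.
    # tau (in pixels) is an assumed constant, not from the source.
    depth = compensated.shape[0]
    w = np.exp(-np.arange(depth) / tau)[:, None]
    return w * compensated + (1.0 - w) * raw_log
```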
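Finally, an illustrative sketch of the filtering and colouring steps using OpenCV. The filter parameters and colour map are arbitrary choices, and the "difference between the compensated intensity and the fitted curve" is read here as a per-pixel signed difference from the log of the fit, which is only one possible interpretation:

```python
import cv2
import numpy as np

def postprocess(compensated, fitted):
    # Bilateral filtering followed by a median ("despeckle") pass to
    # suppress speckle grain; parameter values are illustrative only.
    img = compensated.astype(np.float32)
    img = cv2.bilateralFilter(img, d=5, sigmaColor=0.1, sigmaSpace=5)
    img = cv2.medianBlur(img, ksize=3)

    # Colour by the signed difference from the (log of the) fitted curve.
    diff = img - np.log(fitted + 1e-12)[:, None]
    norm = cv2.normalize(diff, None, 0, 255, cv2.NORM_MINMAX)
    return cv2.applyColorMap(norm.astype(np.uint8), cv2.COLORMAP_JET)
```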