Computer-assisted system to guide a surgical/diagnostic instrument (27) in the body of a patient placed on a support structure, including:
- a first patient marker device (22) configured to contact the patient so that it is integral with a region of the patient's body (P) to be treated by means of said surgical/diagnostic instrument (27), and including first marker elements (M1);
- a second instrument marker device (26) configured to engage said surgical/diagnostic instrument (27) and including second marker elements (M2);
- an optical tracking sensor (20) configured to locate said first and second marker elements (M1, M2); and
- a processing unit (4) connected to the optical tracking sensor (20) and adapted to perform virtual navigation, based on an imported/reconstructed three-dimensional image and on the image detected by the optical tracking sensor (20), so as to display in a vision unit (8):
  - a three-dimensional representation of the body region, generated from said three-dimensional image repositioned in relation to the optical tracking sensor (20); and
  - a three-dimensional representation of at least an operative portion (25b) of the surgical/diagnostic instrument (27), graphically superimposed on the three-dimensional representation of the patient's body region using the position of the second marker elements (M2) and a model of the surgical instrument;
said optical tracking sensor (20) being provided with a stereoscopic vision system in which a first infrared video camera (42-a), associated with a first infrared illuminator (43-a), is supported by a first end (40-a) of an elongated support element (40), and a second infrared video camera (42-b), associated with a second infrared illuminator (43-b), is supported by a second end (40-b) of the elongated support element (40); the video cameras (42-a, 42-b) having respective axes (47-a, 47-b) tilted towards each other and converging in a vision zone (FOV) of the optical tracking sensor (20); said optical tracking sens
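The stereoscopic arrangement described in the claim, with two infrared cameras at the ends of the support element and their axes converging in the vision zone, localizes a marker by intersecting the two viewing rays. A minimal triangulation sketch follows; the geometry (camera positions, ray directions, the `triangulate` helper) is illustrative only and not taken from the patent:

```python
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Estimate a marker position from two viewing rays c_i + t_i * d_i.

    Each camera contributes a ray from its optical center toward the
    marker; the estimate is the midpoint of the shortest segment
    joining the two rays (exact when the rays intersect).
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for ray parameters t1, t2 minimizing |(c1 + t1*d1) - (c2 + t2*d2)|
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(c2 - c1) @ d1, (c2 - c1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    p1 = c1 + t1 * d1
    p2 = c2 + t2 * d2
    return (p1 + p2) / 2.0

# Hypothetical setup: cameras at the two ends of the elongated support
# element, axes tilted toward each other and converging in the FOV.
cam_left = np.array([-0.3, 0.0, 0.0])   # first end (40-a)
cam_right = np.array([0.3, 0.0, 0.0])   # second end (40-b)
marker = np.array([0.05, 0.1, 1.0])     # illustrative marker position (meters)
p = triangulate(cam_left, marker - cam_left, cam_right, marker - cam_right)
```

In practice the ray directions would come from calibrated camera models rather than from the marker position itself; the sketch only shows the intersection step.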
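Superimposing the instrument representation on the body-region image requires a rigid transform relating the instrument model to the tracked positions of the second marker elements (M2). One standard way to obtain such a transform from corresponding point sets is the Kabsch algorithm; the sketch below, including the marker layout and the tip offset, is a hypothetical illustration rather than the method claimed:

```python
import numpy as np

def rigid_transform(A, B):
    """Kabsch algorithm: find R, t such that B ≈ (R @ A.T).T + t.

    A holds marker coordinates in the instrument-model frame,
    B the same markers as measured by the tracking sensor.
    """
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

# Hypothetical marker layout in the instrument frame (non-coplanar).
model_markers = np.array([[0.0, 0.0, 0.0],
                          [0.1, 0.0, 0.0],
                          [0.0, 0.1, 0.0],
                          [0.0, 0.0, 0.05]])

# Simulated sensor measurements: the instrument rotated 30 deg about z
# and translated, standing in for the tracked M2 positions.
th = np.deg2rad(30.0)
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0,         0.0,        1.0]])
t_true = np.array([0.2, -0.1, 0.5])
tracked_markers = model_markers @ R_true.T + t_true

R, t = rigid_transform(model_markers, tracked_markers)
tip_model = np.array([0.0, 0.0, -0.15])  # operative portion offset (illustrative)
tip_in_sensor_frame = R @ tip_model + t  # point to draw over the body image
```

The recovered pose lets the processing unit place any point of the instrument model, such as its operative portion, in the sensor's coordinate frame for display.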