Automatic, remote monitoring of body temperature through multi-view eye localisation could be very valuable to the cattle industry. Many studies have reported that a rise in a cow's body temperature can be a sign of fever or local inflammation. Regular body temperature measurement would therefore help cattle producers identify at-risk animals, make better-informed management decisions, and maintain the commercial value of their mobs. Under commercial conditions, however, it is difficult to measure the temperature of large mobs of cattle accurately. Traditional methods of measuring body temperature are invasive and time-consuming, creating demand for an alternative approach; an automated process that measures body temperature efficiently and accurately would be of great benefit to the cattle industry. Infrared thermography (IRT) can estimate the body temperature of cattle by analysing an infrared thermal image of the eye region in real time. IRT has already played a vital role in the medical sector, supporting automated systems for detecting fever, infection, and musculoskeletal disorders. However, most previously proposed eye localisation methods rely on frontal views only (in humans) or on manual localisation (in cattle). Our contributions are a thermal image processing method for eye segmentation and a computer vision method for multi-view face detection in cattle. The computer vision method trains three classifiers for multi-view face detection, using Histogram of Oriented Gradients (HOG) features and a Support Vector Machine (SVM) classifier. For image processing, a new automated eye segmentation method based on thresholding techniques is proposed.
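The HOG-plus-SVM detection pipeline described above can be sketched as follows. This is a minimal illustration only: the window size, HOG parameters, and the toy training data are assumptions, not the paper's actual settings, and in the proposed method one such classifier would be trained per view (e.g. frontal, left, right).

```python
# Illustrative HOG + linear-SVM classifier for one view. Window size,
# HOG parameters, and synthetic training windows are assumptions.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

WINDOW = (64, 64)  # assumed detection-window size

def hog_features(window):
    """Compute a HOG descriptor for one grayscale window."""
    return hog(window, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), feature_vector=True)

rng = np.random.default_rng(0)

def make_window(face):
    """Toy data: a bright central blob stands in for a warm face region."""
    img = rng.random(WINDOW) * 0.2
    if face:
        img[16:48, 16:48] += 0.8
    return img

# Alternate positive (face) and negative (background) training windows.
X = np.array([hog_features(make_window(i % 2 == 0)) for i in range(40)])
y = np.array([1 if i % 2 == 0 else 0 for i in range(40)])

clf = LinearSVC().fit(X, y)
print(clf.predict([hog_features(make_window(True))])[0])  # → 1
```

In practice the trained classifier is slid over the thermal frame and each window's HOG descriptor is scored, with one classifier per head orientation.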
The proposed methods work together in an indoor environment to localise the eye region of cattle and measure its temperature across different orientations, angles, and positions. Results showed that the proposed method achieves high accuracy, with an average sensitivity of 0.9780, precision of 0.7212, F-measure of 0.8024, and misclassification rate of 0.0455.