Long-duration acoustic monitoring is becoming an increasingly popular approach to extend survey effort by using autonomous sensors to passively collect data over large temporal and spatial scales. This is of particular benefit when attempting to detect a species whose temporal vocalization strategy is unknown and whose small population size reduces detection probability. It is also of benefit in environments that are logistically difficult to access, such as wetlands. We investigated the vocalization strategy of the Least Bittern (Ixobrychus exilis), a species of high conservation concern in the Western Hemisphere and ‘in need of management’ in multiple states of the USA. The Least Bittern is a secretive marsh bird that is primarily detected by its vocalizations, and call-playback surveys are typically used for population monitoring. To minimize disturbance to both the birds and their habitat, we deployed autonomous acoustic recording units and collected continuous 24-hour audio recordings for 30 days. The resultant accumulation of data necessitated an automated method to assist with analysis and interpretation. We successfully applied a novel soundscape technique—long-duration, false-color (LDFC) spectrograms—to visually confirm the presence of the Least Bittern from the ‘coo coo coo’ vocalization associated with breeding. In addition, we used a machine learning technique to automate the acoustic event detection process. Peak vocalization times were then predicted from an annotated dataset of actual calls and subsequently used to develop an optimal acoustic survey strategy. The results of this research demonstrate how machine learning methods can search large data sets for the vocalizations of a specific species. This information can then be used to optimize existing monitoring methods, to increase detection probability, and to minimize associated costs.
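The LDFC technique summarized above works by computing several acoustic indices per time segment (typically one minute) of a long recording and mapping three of them to the red, green, and blue channels of an image, so that days of audio can be scanned visually. The sketch below is a simplified, illustrative version only, not the authors' pipeline: it collapses the frequency axis (true LDFC spectrograms keep per-frequency-bin index values), and the three indices (an ACI-like spectral-change index, normalized spectral entropy, and an event-count index) are rough stand-ins for the standard definitions. All function names are hypothetical.

```python
import numpy as np

def acoustic_indices(segment, n_fft=512):
    """Compute three simple acoustic indices from a 1-D audio segment (illustrative)."""
    n = (len(segment) // n_fft) * n_fft
    windows = segment[:n].reshape(-1, n_fft)
    # Magnitude spectrum of non-overlapping windows
    spec = np.abs(np.fft.rfft(windows, axis=1))
    # ACI-like index: mean relative spectral change between consecutive windows
    aci = np.mean(np.abs(np.diff(spec, axis=0)) / (spec[:-1] + 1e-12))
    # Spectral entropy, normalized to [0, 1] (0 = pure tone, 1 = flat noise)
    p = spec.mean(axis=0)
    p = p / p.sum()
    ent = -np.sum(p * np.log2(p + 1e-12)) / np.log2(len(p))
    # Event index: fraction of windows whose RMS exceeds twice the median RMS
    rms = np.sqrt(np.mean(windows ** 2, axis=1))
    evn = np.mean(rms > 2 * np.median(rms))
    return aci, ent, evn

def ldfc_image(audio, sr, segment_s=60):
    """Map per-segment (ACI, entropy, events) triples to rows of an RGB-style array."""
    seg = segment_s * sr
    rows = [acoustic_indices(audio[i:i + seg])
            for i in range(0, len(audio) - seg + 1, seg)]
    img = np.asarray(rows, dtype=float)
    # Normalize each index channel to [0, 1] so the three are comparable as colors
    img = (img - img.min(axis=0)) / (np.ptp(img, axis=0) + 1e-12)
    return img  # shape: (n_segments, 3)
```

In a real deployment, each row (or column) of such an image corresponds to one minute of a 24-hour recording, and a distinctive, regularly repeated call such as the Least Bittern's ‘coo coo coo’ shows up as a recognizable color pattern that an analyst can spot without listening to the audio.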