A differentially private random decision forest using reliable signal-to-noise ratios

Samuel Fletcher, Md Zahidul Islam

Research output: Conference paper in published proceedings (peer-reviewed)

19 Citations (Scopus)


When dealing with personal data, it is important for data miners to have algorithms available for discovering trends and patterns in the data without exposing people’s private information. Differential privacy offers an enforceable definition of privacy that guarantees each individual in a dataset that their personal information is no more at risk than it would be if their data were not in the dataset at all. Using mechanisms that achieve differential privacy, we propose a decision forest algorithm that applies the theory of signal-to-noise ratios to automatically tune the algorithm’s parameters and to ensure that the differentially private noise added to the results does not outweigh the true results. Our experiments demonstrate that our differentially private algorithm can achieve high prediction accuracy.
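The abstract's core idea can be illustrated with a small sketch. This is not the paper's algorithm; it is a minimal, assumed example of the standard Laplace mechanism for differentially private counts, together with a hypothetical signal-to-noise check of the kind the abstract alludes to: a noisy count is treated as reliable only when the true signal dominates the expected noise magnitude. The function names and the threshold value are illustrative choices, not from the paper.

```python
import math
import random

def laplace_noise(scale):
    # Sample Laplace(0, scale) noise via inverse-CDF sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(true_count, epsilon, sensitivity=1.0):
    # Laplace mechanism: adding Laplace(sensitivity / epsilon) noise to a
    # count query satisfies epsilon-differential privacy.
    return true_count + laplace_noise(sensitivity / epsilon)

def signal_to_noise(true_count, epsilon, sensitivity=1.0):
    # The expected absolute value of Laplace(0, b) noise is b, so a simple
    # SNR estimate is the true count divided by the noise scale.
    return true_count * epsilon / sensitivity

def is_reliable(true_count, epsilon, threshold=3.0, sensitivity=1.0):
    # Illustrative rule: only trust a noisy count when the signal is at
    # least `threshold` times the expected noise magnitude.
    return signal_to_noise(true_count, epsilon, sensitivity) >= threshold
```

For example, with a privacy budget of epsilon = 0.1, a leaf holding 100 records has an SNR of 10 and would pass a threshold of 3, while a leaf holding 10 records (SNR 1) would not; the paper's contribution is to use this kind of reasoning to tune the forest's parameters automatically.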
Original language: English
Title of host publication: AI 2015: Advances in Artificial Intelligence
Subtitle of host publication: 28th Australasian Joint Conference Proceedings
Editors: Bernhard Pfahringer, Jochen Renz
Place of publication: Germany
Number of pages: 12
ISBN (Electronic): 9783319263502
ISBN (Print): 9783319263496
Publication status: Published - 2015
Event: Australasian Joint Conference on Artificial Intelligence - Canberra, Australia
Duration: 30 Nov 2015 - 04 Dec 2015

Publication series

Name: Lecture Notes in Artificial Intelligence
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: Australasian Joint Conference on Artificial Intelligence


