In your paper, you mention several methods for generating the event space. But in the implementation of the algorithms, it seems you already know which event is going to violate differential privacy.
Take the histogram algorithm for example; your implementation is:

```python
import numpy as np

def histogram(queries, epsilon):
    noisy_array = np.asarray(queries, dtype=np.float64) \
        + np.random.laplace(scale=1.0 / epsilon, size=len(queries))
    return noisy_array[0]
```
How do you know in advance that the quantization of the first element of the noisy vector is the event that would cause a violation of differential privacy?
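To make my confusion concrete: I expected the detector to search over candidate events rather than fix index 0 up front. A minimal sketch of what I had in mind, assuming a simple threshold event per coordinate (the function names, the threshold, and the neighboring inputs `D1`/`D2` here are my own illustration, not your tool's API):

```python
import numpy as np

def histogram_full(queries, epsilon):
    # Same Laplace mechanism as above, but returning the whole noisy
    # vector so that every coordinate can be tested as an event.
    noisy_array = np.asarray(queries, dtype=np.float64) \
        + np.random.laplace(scale=1.0 / epsilon, size=len(queries))
    return noisy_array

def estimate_event_probs(queries, epsilon, threshold=0.0, trials=20000):
    # Monte Carlo estimate of Pr[output[i] < threshold] for each index i,
    # i.e. one candidate event per coordinate of the output.
    hits = np.zeros(len(queries))
    for _ in range(trials):
        hits += histogram_full(queries, epsilon) < threshold
    return hits / trials

# Hypothetical neighboring databases differing in one count:
D1 = [2, 1, 1, 1, 1]
D2 = [1, 1, 1, 1, 1]
p1 = estimate_event_probs(D1, epsilon=1.0)
p2 = estimate_event_probs(D2, epsilon=1.0)
# A detector could then scan all i and flag any index where
# p1[i] > exp(epsilon) * p2[i] (or vice versa), instead of
# committing to index 0 before running the mechanism.
```

Under this sketch, the event is chosen by comparing the estimated probabilities across all coordinates, which is what I understood "generating the event space" to mean.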
Thank you!