cgoo.py: Contains code for running differentially private fair regression algorithms.

1)  gen_data_lin_norm: generates data from a normally distributed independent variable.
2)  gen_data_lin_unif: generates data from a uniformly distributed independent variable.
3)  gen_data_lin_exp: generates data from an exponentially distributed independent variable.
4)  l(c, K, data): calculates per-group MSPE (Mean Squared Prediction Error).
5)  pl(c, K, data, rho): calculates per-group MSPE with a rho-zCDP guarantee.
6)  pl_se(c, K, data, rho, s1, s2): for 2 groups, uses the standard errors s1 and s2 to distribute the privacy budget rho between groups.
7)  grad_l(c, K, data): calculates gradient of per-group MSPE.
8)  pgrad_l(c, K, data, rho): calculates gradient of per-group MSPE using budget of rho.
9)  pgrad_l_se(c, K, data, rho, s1, s2): uses standard error information to calculate pgrad_l.
10) f(c, K, data): sums all losses for all groups.
11) grad_f(c, K, data): compute gradient of f.
12) pgrad_f(c, K, data): privately compute gradient of f.
13) g(c, K, data): compute loss for smallest group.
14) grad_g(c, K, data): compute gradient of g.
15) pgrad_g(c, K, data, rho): privately compute gradient of g using rho-zCDP.
16) smooth_max: computes smooth maximum.
17) pgdcgoo: compute with privacy and fairness.
18) pgdcgoo_nonfair: compute with privacy w/o fairness.
19) pgdcgoo_nonfair_se: compute pgdcgoo_nonfair w/ standard error information.
20) mspe: computes Mean Squared Prediction Error.
21) standard_errors: computes standard errors.
22) synthetic_datasets: generates synthetic datasets for 2 groups of sizes (n1, n2)
    and slopes (slope1, slope2) over 'num_trails' trial runs, with variance vare
    for the dependent-variable noise.
23) simulation: sweeps over a range of rho values, runs the functions above, and collects results.
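As an illustration of items 4 and 16, the per-group loss and the smooth maximum might look like the following minimal sketch. The signatures and the data layout (data[k] = (X_k, y_k)) are assumptions for illustration, not the file's actual code; smooth_max is sketched as the standard log-sum-exp approximation.

```python
import numpy as np

def smooth_max(vals, t=50.0):
    """Log-sum-exp smooth approximation to max(vals).

    Larger t means a tighter (less smooth) approximation.
    The max is subtracted first for numerical stability.
    """
    vals = np.asarray(vals, dtype=float)
    m = vals.max()
    return m + np.log(np.exp(t * (vals - m)).sum()) / t

def l(c, K, data):
    """Per-group MSPE of the linear predictor c.

    Assumed layout: data[k] = (X_k, y_k) for group k = 0..K-1.
    Returns an array of K per-group mean squared prediction errors.
    """
    return np.array([np.mean((data[k][1] - data[k][0] @ c) ** 2)
                     for k in range(K)])
```

With this layout, g(c, K, data) from item 13 would reduce to taking the (smooth) maximum of the array returned by l.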

Other functions process real-world datasets (instead of synthetic ones): real_datasets, getlawschooldata.
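The privatized gradients (items 8, 9, and 15) presumably rely on the Gaussian mechanism, which satisfies rho-zCDP when the noise scale is sensitivity / sqrt(2 * rho). A hedged sketch, assuming an even budget split across groups, a fixed illustrative L2 sensitivity, and the same data layout as above (none of these are taken from the file itself):

```python
import numpy as np

def pgrad_l(c, K, data, rho, rng=None):
    """Sketch of per-group MSPE gradients with rho-zCDP via Gaussian noise.

    Assumptions (illustrative, not from the source file):
      - the budget rho is split evenly across the K groups;
      - each per-group gradient has L2 sensitivity `sens` (problem-specific;
        fixed to 1.0 here purely for illustration);
      - data[k] = (X_k, y_k).
    """
    rng = np.random.default_rng() if rng is None else rng
    sens = 1.0                          # assumed L2 sensitivity per group
    rho_k = rho / K                     # even zCDP budget split
    sigma = sens / np.sqrt(2 * rho_k)   # Gaussian-mechanism noise scale
    grads = []
    for X, y in data:
        g = -2.0 * X.T @ (y - X @ c) / len(y)   # gradient of the group's MSPE
        grads.append(g + rng.normal(0.0, sigma, size=g.shape))
    return np.array(grads)
```

The standard-error variants (pl_se, pgrad_l_se) would replace the even split rho / K with weights derived from s1 and s2, spending more budget on the noisier group.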