Aaditya Ramdas

These keywords quickly get my attention:

e-values and confidence sequences (e-processes, supermartingales, testing by betting, sequential inference, optional stopping, peeking and p-hacking, change detection, anytime p-values, Ville's inequality, game-theoretic statistics)

conformal prediction and calibration (distribution-free inference, uncertainty quantification for black-box machine learning, covariate/label shift, beyond exchangeability)

multiple testing and post-selection inference (false discovery rate, inference after model selection, online or interactive or bandit testing, post-hoc simultaneous inference)

high-dimensional, nonparametric statistics and machine learning (kernel methods, minimax rates, dimension-agnostic inference, universal inference, differential privacy, optimization)

Courses, Workshops, Tutorials, Software, Talks, etc.

Aaditya Ramdas (PhD, 2015) is an assistant professor at Carnegie Mellon University, in the Departments of Statistics and Machine Learning. He was a postdoc at UC Berkeley (2015–2018) mentored by Michael Jordan and Martin Wainwright, and obtained his PhD at CMU (2010–2015) under Aarti Singh and Larry Wasserman, receiving the Umesh K. Gavaskar Memorial Thesis Award. His undergraduate degree was in Computer Science from IIT Bombay (2005–2009).

Aaditya received the 2024 Sloan Fellowship in mathematics and the IMS Peter Gavin Hall Early Career Prize (2023), and was an inaugural recipient of the COPSS Emerging Leader Award (2021) and a recipient of the Bernoulli New Researcher Award (2021). His work is supported by an NSF CAREER Award (2020), an Adobe Faculty Research Award (2019), and a Google Research Scholar Award (2022). He was a CUSO lecturer in 2022, a Lunteren lecturer in 2023, and a keynote speaker at AISTATS 2024.

Aaditya's research is in mathematical statistics and learning, with an eye towards designing algorithms that both have strong theoretical guarantees and also work well in practice. His main research interests include selective and simultaneous inference (interactive, structured, online, post-hoc control of false decision rates, etc), game-theoretic statistics (sequential uncertainty quantification, confidence sequences, always-valid p-values, safe anytime-valid inference, e-processes, supermartingales, etc), and distribution-free black-box predictive inference (conformal prediction, post-hoc calibration, etc). His areas of applied interest include privacy, neuroscience, genetics and auditing (elections, real-estate, financial, fairness), and his group's work has received multiple best paper awards.

He is one of the organizers of the amazing and diverse StatML Group at CMU. Outside of work, some easy topics for conversation include travel/outdoors (hiking, scuba, etc.), trash-free living, completing the Ironman triathlon, and long-distance bicycle rides.

Curriculum Vitae

Preprints (under review or revision)

Combining exchangeable p-values (with M. Gasparin, R. Wang).       arXiv | TLDR

Conformal online model aggregation (with M. Gasparin).       arXiv | TLDR

The numeraire e-variable and reverse information projection (with M. Larsson, J. Ruf).       arXiv | TLDR

Combining evidence across filtrations using adjusters (with Y.J. Choe).       arXiv | TLDR

Distribution-uniform strong laws of large numbers (with I. Waudby-Smith, M. Larsson).       arXiv | TLDR

Positive semidefinite supermartingales and randomized matrix concentration inequalities (with H. Wang).       arXiv | TLDR

Merging uncertainty sets via majority vote (with M. Gasparin).       arXiv | TLDR

Sequential Monte-Carlo testing by betting (with L. Fischer).       arXiv | TLDR

Time-uniform confidence spheres for means of random vectors (with B. Chugg, H. Wang).       arXiv | TLDR We use PAC-Bayesian techniques to derive multivariate confidence sequences for means, in both light and heavy tailed settings, including a multivariate empirical-Bernstein inequality for bounded random variables.

Distribution-uniform anytime-valid inference (with I. Waudby-Smith).       arXiv | TLDR We give the first distribution-uniform guarantees for asymptotic anytime-valid procedures by deriving a novel and uniformly valid strong Gaussian approximation theorem.

Time-uniform self-normalized concentration for vector-valued processes (with J. Whitehouse, S. Wu).       arXiv | TLDR We derive new self-normalized confidence sequences for multivariate means under a variety of sub-psi tail conditions, including a new multivariate empirical-Bernstein inequality for bounded observations.

Anytime-valid t-tests and confidence sequences for Gaussian means with unknown variance (with H. Wang).       arXiv | TLDR We derive and compare various one-sided and two-sided anytime-valid t-tests and analyze the width of the corresponding confidence sequences, including pointing out that an improper/flat mixture approach of Lai is rather suboptimal in practice.

On the near-optimality of betting confidence sets for bounded means (with S. Shekhar).       arXiv | TLDR We derive information theoretic lower bounds on the width of any confidence interval or confidence sequence for the mean of a bounded random variable, and show that recent betting-based confidence sets essentially achieve that lower bound.
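The betting construction analyzed above admits a short sketch. The following is a simplified, fixed-bet-fraction version of a hedged capital process (the function name, grid-based inversion, and bet clipping are illustrative, not the paper's tuned strategy): for each candidate mean m, a gambler bets for and against m, and Ville's inequality implies that the candidates whose capital stays below 1/alpha form a (1 − alpha)-confidence set.

```python
import numpy as np

def betting_ci(x, alpha=0.05, lam=0.5, grid_size=500):
    """Grid-based inversion of a hedged betting martingale for the mean
    of [0,1]-bounded data. For each candidate mean m, run the capital
    process prod_t (1 + lam_t * (x_t - m)); by Ville's inequality,
    {m : capital(m) < 1/alpha} is a (1 - alpha) confidence set."""
    x = np.asarray(x, dtype=float)
    candidates = np.linspace(1e-3, 1 - 1e-3, grid_size)
    kept = []
    for m in candidates:
        # clip bets so the capital process stays nonnegative
        l_up = min(lam, 0.9 / m)          # bet that the true mean exceeds m
        l_dn = min(lam, 0.9 / (1 - m))    # bet that the true mean is below m
        cap_up = np.exp(np.log1p(l_up * (x - m)).sum())
        cap_dn = np.exp(np.log1p(-l_dn * (x - m)).sum())
        if 0.5 * (cap_up + cap_dn) < 1 / alpha:   # m not yet refuted
            kept.append(m)
    return min(kept), max(kept)
```

Replacing the fixed bet fraction with an adaptive, variance-tracking scheme yields much tighter intervals; the paper above shows that such betting constructions are essentially width-optimal.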

Scalable causal structure learning via amortized conditional independence testing (with J. Leiner, B. Manzo, W. Tansey).       arXiv | code | TLDR We develop a method that leverages discrete optimization techniques for efficient causal discovery, and apply it to a cancer dataset to reveal connections between somatic gene mutations and metastases to different tissues.

Reducing sequential change detection to sequential estimation (with S. Shekhar).       arXiv | TLDR We reduce sequential change detection to sequential estimation using confidence sequences, much like Lorden's classical reduction to sequential testing.

Total variation floodgate for variable importance inference in classification (with W. Wang, L. Janson, L. Lei).       arXiv | TLDR We define the expected total variation (ETV), which is an intuitive, model-agnostic measure of variable importance and design algorithms for statistical inference on the ETV under design-based/model-X assumptions.

More powerful multiple testing under dependence via randomization (with Z. Xu).       arXiv | TLDR We show that the power of three procedures for multiple testing under arbitrary dependence (Benjamini-Yekutieli, e-BH, and Simes) can be strictly improved using a simple uniform external randomization.

Randomized and exchangeable improvements of Markov's, Chebyshev's and Chernoff's inequalities (with T. Manole).       arXiv

The extended Ville's inequality for nonintegrable nonnegative supermartingales (with H. Wang).       arXiv | TLDR We present a rigorous study of supermartingales and their maximal inequalities without the traditional L1-integrability assumption, which typically arise as infinite mixtures of traditional, L1-integrable martingales; applications include deriving "frequentist" confidence sequences despite using Bayesian flat, non-informative priors.
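For context, the classical result being extended here is Ville's inequality: for any nonnegative supermartingale $(M_t)_{t \ge 0}$ (with the usual $L_1$-integrability) and any $a > 0$,

```latex
\mathbb{P}\!\left( \sup_{t \ge 0} M_t \ge a \right) \;\le\; \frac{\mathbb{E}[M_0]}{a},
```

so a capital process starting at $\mathbb{E}[M_0] = 1$ ever exceeds $1/\alpha$ with probability at most $\alpha$. The paper above studies what survives of this guarantee once $L_1$-integrability is dropped.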

A sequential test for log-concavity (with A. Gangrade, A. Rinaldo).       arXiv

Admissible anytime-valid sequential inference must rely on nonnegative martingales (with J. Ruf, M. Larsson, W. Koolen).       arXiv

Time-uniform central limit theory and asymptotic confidence sequences (with I. Waudby-Smith, D. Arbour, R. Sinha, E. H. Kennedy).       arXiv | code

Post-selection inference for e-value based confidence intervals (with Z. Xu, R. Wang).       arXiv | talk | slides | TLDR We propose the e-BY procedure for post-selection inference (false coverage rate of chosen confidence intervals), which allows the Benjamini-Yekutieli procedure to be applied under both arbitrary selection rules and dependence, without an extra log factor correction, when using e-value based CIs.

Interactive identification of individuals with positive treatment effect while controlling false discoveries (with B. Duan, L. Wasserman).       arXiv

Multiple testing under negative dependence (with Z. Chi, R. Wang).       arXiv

On the existence of powerful p-values and e-values for composite hypotheses (with Z. Zhang, R. Wang).       arXiv

Published (or accepted) papers

Universal inference meets random projections: a scalable test for log-concavity (with R. Dunn, A. Gangrade, L. Wasserman), J Comp & Graphical Stat, 2024.       arXiv | code | TLDR We develop the first provably valid test for log-concavity, which is a shape constraint for density estimation with applications across economics, survival modeling, and reliability theory. The test is based on universal inference and a scalable variant using random projections is developed.

De Finetti's Theorem and related results for infinite weighted exchangeable sequences (with R. Barber, E. Candes, R. Tibshirani), Bernoulli, 2024       arXiv

Semiparametric efficient inference in adaptive experiments (with T. Cook, A. Mishler), Conference on Causal Learning and Reasoning (CLeaR), 2024.       arXiv | TLDR We show how to achieve semiparametric efficient confidence intervals and confidence sequences for the average treatment effect when performing adaptive experimentation, based on the Adaptive AIPW (A2IPW) estimator of Kato.

Anytime-valid off-policy inference for contextual bandits (with I. Waudby-Smith, L. Wu, N. Karampatziakis, P. Mineiro), ACM/IMS J of Data Science, 2024.       arXiv | proc

Testing exchangeability by pairwise betting (with A. Saha), Intl. Conf. on AI and Statistics (AISTATS), 2024. oral talk       arXiv | TLDR As an alternative to universal inference and conformal prediction, we propose a new method for testing exchangeability which uses betting on pairs of observations, which applies to general observation spaces and can be shown to be consistent against reasonable alternatives.

Graph fission and cross-validation (with J. Leiner), Intl. Conf. on AI and Statistics (AISTATS), 2024       arXiv | TLDR We extend data fission/thinning to the graphical setting to develop a procedure that takes in an input graph with noisy observations at each node, and creates multiple independent synthetic copies of the graph with the same node+edge set as the original and noisier observations at each node. We then show how this method can be used for cross-validation and structural trend estimation on graph data.

Online multiple testing with e-values (with Z. Xu), Intl. Conf. on AI and Statistics (AISTATS), 2024.       arXiv | TLDR We design an extension of the LOND algorithm for online FDR and FCR control with e-values.

Deep anytime-valid hypothesis testing (with T. Pandeva, P. Forré, S. Shekhar), Intl. Conf. on AI and Statistics (AISTATS), 2024.       arXiv

Differentially private conditional independence testing (with I. Kalemaj, S. Kasiviswanathan), Intl. Conf. on AI and Statistics (AISTATS), 2024.       arXiv | TLDR We derive differentially private versions of the generalized covariance measure (GCM), in both observational settings and under the model-X assumption, the primary difficulty of which arises from the fact that changing one datapoint changes all the GCM residuals.

E-detectors: a nonparametric framework for online changepoint detection (with J. Shin, A. Rinaldo), New England J of Stat. and Data Science, 2023.       arXiv | proc

A unified recipe for deriving (time-uniform) PAC-Bayes bounds (with B. Chugg, H. Wang), J of ML Research, 2023.       arXiv | proc

A permutation-free kernel independence test (with S. Shekhar, I. Kim), J of ML Research, 2023.       arXiv | code | proc | TLDR We propose a new kernel-HSIC statistic that drops half the terms of the original, and show that it has a standard normal limiting null distribution under low and high dimensional regimes, resulting in a test that is trivial to calibrate without permutations. The analysis of this statistic requires generalizing the techniques we developed earlier for the corresponding kernel-MMD statistic, to deal with the more complicated dependence structure.

Data fission: splitting a single data point (with J. Leiner, B. Duan, L. Wasserman), J of American Stat Assoc, 2023 arXiv | proc | poster | slides | code | talk | TLDR (Discussion paper) We devise an alternative to data splitting using external randomization called data fission that more efficiently splits information in many circumstances and then apply it to several examples in post-selection inference: interactive multiple testing, fixed-design linear regression, generalized linear models, and trend filtering.

A composite generalization of Ville's martingale theorem using e-processes (with J. Ruf, M. Larsson, W. Koolen), Elec. J. of Prob., 2023 arXiv | proc | TLDR Ville's famous martingale theorem relates measure-theoretic probability to betting: it states that for any event of measure zero, there exists a nonnegative martingale (a betting strategy) that multiplies its initial wealth infinitely if the event occurs. We prove a composite generalization of that theorem, which requires generalizing "measure zero" to a certain inverse capital outer measure and generalizing "nonnegative martingale" to e-processes.

Online multiple hypothesis testing (with D. Robertson, J. Wason), Statistical Science, 2023 arXiv | proc

Nonparametric two-sample testing by betting (with S. Shekhar), IEEE Trans. on Info. Theory, 2023       arXiv | proc | code | slides | TLDR We develop a general framework for designing sequential two-sample tests, and obtain a general characterization of the power of these tests in terms of the regret of an associated online prediction game. This yields the "right" sequential generalizations of many offline nonparametric two-sample tests like Kolmogorov-Smirnov or kernel-MMD.

E-values as unnormalized weights in multiple testing (with N. Ignatiadis, R. Wang), Biometrika, 2023 arXiv | proc

Comparing sequential forecasters (with Y.J. Choe), Operations Research, 2023 arXiv | proc | code | talk | poster | slides (Citadel, Research Showcase Runner-up)

Game-theoretic statistics and safe anytime-valid inference (with P. Grunwald, V. Vovk, G. Shafer), Statistical Science, 2023 arXiv | proc

Adaptive privacy composition for accuracy-first mechanisms (with R. Rogers, G. Samorodnitsky, S. Wu), Conf. on Neural Information Processing Systems (NeurIPS), 2023 arXiv | proc | TLDR We derive basic and advanced composition results and privacy filters for noise-reduction mechanisms that allow an analyst to adaptively switch between differentially private and ex-post private mechanisms subject to an overall privacy guarantee.

Sequential predictive two-sample and independence testing (with A. Podkopaev), Conf. on Neural Information Processing Systems (NeurIPS), 2023 arXiv | proc

Auditing fairness by betting (with B. Chugg, S. Cortes-Gomez, B. Wilder), Conf. on Neural Information Processing Systems (NeurIPS), 2023 arXiv | code | proc

Counterfactually comparing abstaining classifiers (with Y. J. Choe, A. Gangrade), Conf. on Neural Information Processing Systems (NeurIPS), 2023 arXiv | slides | proc

An efficient doubly-robust test for the kernel treatment effect (with D. Martinez-Taboada, E. Kennedy), Conf. on Neural Information Processing Systems (NeurIPS), 2023 arXiv | proc

On the sublinear regret of GP-UCB (with J. Whitehouse, S. Wu), Conf. on Neural Information Processing Systems (NeurIPS), 2023 arXiv | TLDR By appropriately regularizing simple confidence sequences in Hilbert spaces, we derive (for the first time) sublinear regret for GP-UCB for any kernel with polynomial decay (including Matern).

Martingale methods for sequential estimation of convex functionals and divergences (with T. Manole), IEEE Trans. on Information Theory, 2023 arXiv | article | talk (Student Research Award, Stat Soc Canada) | TLDR We derive confidence sequences for convex functionals, with an emphasis on convex divergences such as the kernel Maximum Mean Discrepancy and Wasserstein distances; our main technical contribution is to show that empirical plugins of convex functionals/divergences (and more generally processes satisfying a leave-one-out property) are partially ordered reverse submartingales, coupled with maximal inequalities for such processes.

Estimating means of bounded random variables by betting (with I. Waudby-Smith), J. of the Royal Statistical Society, Series B, 2023 arXiv (Discussion paper) | proc | code

Fully adaptive composition in differential privacy (with J. Whitehouse, R. Rogers, Z. S. Wu), Intl. Conf. on Machine Learning (ICML), 2023 arXiv

Online Platt scaling with calibeating (with C. Gupta), Intl. Conf. on Machine Learning (ICML), 2023 arXiv

A nonparametric extension of randomized response for locally private confidence sets (with I. Waudby-Smith, Z. S. Wu), Intl. Conf. on Machine Learning (ICML), 2023 arXiv | code (oral talk)

Sequential kernelized independence testing (with A. Podkopaev, P. Bloebaum, S. Kasiviswanathan), Intl. Conf. on Machine Learning (ICML), 2023 arXiv

Risk-limiting financial audits via weighted sampling without replacement (with S. Shekhar, Z. Xu, Z. Lipton, P. Liang), Intl. Conf. Uncertainty in AI (UAI), 2023 arXiv | TLDR We introduce the notion of risk-limiting financial audits (RLFA), where the goal is to design statistical procedures to verify an assertion about a set of reported financial transactions. We propose a general RLFA strategy using confidence sequences constructed with weighted sampling without replacement, and also develop techniques that can incorporate any available side information (such as predictions from AI models).

Huber-robust confidence sequences (with H. Wang), Intl. Conf. on AI and Statistics (AISTATS), 2023 arXiv (full oral talk) | TLDR Under a slight generalization of Huber's epsilon-contamination model (where epsilon fraction of the points are arbitrarily corrupted), we derive confidence sequences for univariate means only assuming a finite p-th moment (for p between 1 and 2), which are minimax optimal and perform very well in practice.

Catoni-style confidence sequences for heavy-tailed mean estimation (with H. Wang), Stochastic Processes and Applications, 2023 arXiv | article | code | TLDR We derive confidence sequences, which are confidence intervals valid at arbitrary stopping times, for univariate means only assuming a finite p-th moment (for p between 1 and 2), which are minimax optimal and perform very well in practice.

Anytime-valid confidence sequences in an enterprise A/B testing platform (with A. Maharaj, R. Sinha, D. Arbour, I. Waudby-Smith, S. Liu, M. Sinha, R. Addanki, M. Garg, V. Swaminathan), ACM Web Conference (WWW), 2023 arXiv

Dimension-agnostic inference using cross U-statistics (with I. Kim), Bernoulli, 2023 arXiv | proc | TLDR We introduce dimension-agnostic inference, which is a novel approach for high-dimensional inference that ensures asymptotic validity regardless of how the dimension and sample size scale, while preserving minimax optimal power across diverse scenarios; our main tool is a cross U-statistic, which drops half of the terms of a degenerate U-statistic to yield a limiting Gaussian distribution.

On the power of conditional independence testing under model-X (with E. Katsevich), Electronic J. Stat, 2023 arXiv | article

Permutation tests using arbitrary permutation distributions (with R. Barber, E. Candes, R. Tibshirani), Sankhya A, 2023 arXiv | article

Conformal prediction beyond exchangeability (with R. Barber, E. Candes, R. Tibshirani), Annals of Stat., 2023 arXiv | article

Faster online calibration without randomization: interval forecasts and the power of two choices (with C. Gupta), Conf. on Learning Theory (COLT), 2022 arXiv | article

Top-label calibration and multiclass-to-binary reductions (with C. Gupta), Intl. Conf. on Learning Representations, 2022 arXiv | article

Gaussian universal likelihood ratio testing (with R. Dunn, S. Balakrishnan, L. Wasserman), Biometrika, 2022 arXiv | article | TLDR Under a Gaussian setting, we present the first in-depth exploration of the size, power, and relationships between several universal inference variants. We find that in this setting, the power of universal inference has the same behavior in n, d, alpha and SNR as the classical Wilks' Theorem approach, only losing in a small constant of about 2.

A permutation-free kernel two sample test (with S. Shekhar, I. Kim), Conf. on Neural Information Processing Systems (NeurIPS), 2022 arXiv | article | code | (oral talk) | TLDR We propose a new kernel-MMD statistic that drops half the terms of the original, and show that it has a standard normal limiting null distribution in low and high dimensional regimes. This results in a test that is easy to calibrate, that is up to two orders of magnitude faster than running the permutation test, at the price of a small ($\approx \sqrt{2}$ in effective sample size) reduction in power.

Testing exchangeability: fork-convexity, supermartingales, and e-processes (with J. Ruf, M. Larsson, W. Koolen). Intl J. of Approximate Reasoning, 2022 arXiv | article

Tracking the risk of a deployed model and detecting harmful distribution shifts (with A. Podkopaev). Intl. Conf. on Learning Representations (ICLR), 2022 arXiv | article

Brownian noise reduction: maximizing privacy subject to accuracy constraints (with J. Whitehouse, Z.S. Wu, R. Rogers), Conf. on Neural Information Processing Systems (NeurIPS), 2022 arXiv | article

Sequential estimation of quantiles with applications to A/B-testing and best-arm identification (with S. Howard), Bernoulli, 2022 arXiv | article | code

Brainprints: identifying individuals from magnetoencephalograms (with S. Wu, L. Wehbe), Nature Communications Biology, 2022 bioRxiv | article

Interactive rank testing by betting (with B. Duan, L. Wasserman), Conf. on Causal Learning and Reasoning (CLeaR), 2022 arXiv | article (oral talk)

Large-scale simultaneous inference under dependence (with J. Tian, X. Chen, E. Katsevich, J. Goeman), Scandinavian J of Stat., 2022 arXiv | article

False discovery rate control with e-values (with R. Wang), J. of the Royal Stat. Soc., Series B, 2022 arXiv | article

Nested conformal prediction and quantile out-of-bag ensemble methods (with C. Gupta, A. Kuchibhotla), Pattern Recognition, 2022 arXiv | article | code

Distribution-free prediction sets for two-layer hierarchical models (with R. Dunn, L. Wasserman), J of American Stat. Assoc., 2022 arXiv | article | code | TLDR Conformal methods typically rely on exchangeable data to provide valid prediction sets in finite samples, but we extend conformal methods to construct prediction sets in a nonexchangeable two-layer hierarchical setting, where N groups of data are exchangeable, and the observations within each group are also exchangeable.
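For contrast with the hierarchical setting above, the standard exchangeable split-conformal construction that such work generalizes can be sketched as follows (the function name and the `fit`-returns-a-predictor interface are illustrative assumptions):

```python
import numpy as np

def split_conformal_interval(fit, x_train, y_train, x_cal, y_cal, x_test, alpha=0.1):
    """Standard split conformal regression: train a black-box model on one
    split, score absolute residuals on a calibration split, and return
    f(x_test) +/- q, where q is the ceil((n+1)(1-alpha))-th smallest score.
    Under exchangeability, the interval covers y_test w.p. >= 1 - alpha."""
    model = fit(x_train, y_train)             # any fitting routine -> predictor
    scores = np.abs(np.asarray(y_cal) - model(np.asarray(x_cal)))
    n = len(scores)
    k = int(np.ceil((n + 1) * (1 - alpha)))   # conformal quantile rank
    q = np.sort(scores)[min(k, n) - 1]        # (k > n would give an infinite set)
    pred = model(x_test)
    return pred - q, pred + q
```

The hierarchical paper above replaces the single exchangeable calibration set with N exchangeable groups, each internally exchangeable, and adjusts the quantile step accordingly.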

Fast and powerful conditional randomization testing via distillation (with M. Liu, E. Katsevich, L. Janson), Biometrika, 2021 arXiv | article | code

Uncertainty quantification using martingales for misspecified Gaussian processes (with W. Neiswanger), Algorithmic Learning Theory (ALT), 2021 arXiv | article | code | talk

RiLACS: Risk-limiting audits via confidence sequences (with I. Waudby-Smith, P. Stark), Intl. Conf. for Electronic Voting (EVoteID), 2021 arXiv | article | code (Best Paper award)

Predictive inference with the jackknife+ (with R. Barber, E. Candes, R. Tibshirani), Annals of Stat., 2021 arXiv | article | code

Path length bounds for gradient descent and flow (with C. Gupta, S. Balakrishnan), J. of Machine Learning Research, 2021 arXiv | article | blog

Nonparametric iterated-logarithm extensions of the sequential generalized likelihood ratio test (with J. Shin, A. Rinaldo), IEEE J. on Selected Areas in Info. Theory, 2021 arXiv | article

Time-uniform, nonparametric, nonasymptotic confidence sequences (with S. Howard, J. Sekhon, J. McAuliffe), The Annals of Stat., 2021 arXiv | article | code | tutorial

Off-policy confidence sequences (with N. Karampatziakis, P. Mineiro), Intl. Conf. on Machine Learning (ICML), 2021 arXiv | article

Best arm identification under additive transfer bandits (with O. Neopane, A. Singh), Asilomar Conf. on Signals, Systems and Computers, 2021 arXiv | article (Best Student Paper award)

On the bias, risk and consistency of sample means in multi-armed bandits (with J. Shin, A. Rinaldo), SIAM J. on the Math. of Data Science, 2021 arXiv | article | talk

Dynamic algorithms for online multiple testing (with Z. Xu), Conf. on Math. and Scientific Machine Learning, 2021 arXiv | article | talk | slides | code | TLDR We develop the first practically powerful algorithms that provably controls the supremum of the false discovery proportion with high probability in online multiple testing.

Online control of the familywise error rate (with J. Tian), Statistical Methods in Medical Research, 2021 arXiv | article

Asynchronous online testing of multiple hypotheses (with T. Zrnic, M. Jordan), J. of Machine Learning Research, 2021 arXiv | article | code | blog

Classification accuracy as a proxy for two sample testing (with I. Kim, A. Singh, L. Wasserman), Annals of Stat., 2021 arXiv | article | (JSM Stat Learning Student Paper Award) | TLDR We explore the use of classification accuracy for two-sample testing for general classifiers, proving in particular that the accuracy test based on Fisher's LDA achieves minimax rate-optimal power and establishing conditions for consistency based on general classifiers.
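The accuracy-based test studied above is simple to instantiate. Below is a hedged sketch (the `fit` interface returning a predict function is an illustrative assumption, and the exact binomial calibration is one standard choice, not the paper's only analysis): label the two samples 0/1, train any classifier on half the data, and test whether held-out accuracy beats chance.

```python
import math
import numpy as np

def accuracy_two_sample_test(X, Y, fit):
    """Classifier two-sample test: pool and label the samples, train on a
    random half, and compute an exact one-sided binomial p-value for the
    held-out accuracy exceeding 1/2 (chance level under the null)."""
    Z = np.concatenate([X, Y])
    labels = np.concatenate([np.zeros(len(X)), np.ones(len(Y))])
    rng = np.random.default_rng(0)
    idx = rng.permutation(len(Z))
    half = len(Z) // 2
    train, test = idx[:half], idx[half:]
    predict = fit(Z[train], labels[train])
    correct = int((predict(Z[test]) == labels[test]).sum())
    n = len(test)
    # P(Binomial(n, 1/2) >= correct): valid since test predictions are
    # independent coin flips relative to the labels under the null
    return sum(math.comb(n, k) for k in range(correct, n + 1)) / 2**n
```

With a perfectly separating classifier and well-separated samples, the p-value is essentially 2^(-n); the paper's contribution is characterizing when such tests achieve minimax rate-optimal power.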

Distribution-free calibration guarantees for histogram binning without sample splitting (with C. Gupta), Intl. Conf. on Machine Learning, 2021 arXiv | article

Distribution-free uncertainty quantification for classification under label shift (with A. Podkopaev), Conf. on Uncertainty in AI, 2021 arXiv | article

Distribution-free binary classification: prediction sets, confidence intervals and calibration (with C. Gupta, A. Podkopaev), Conf. on Neural Information Processing Systems (NeurIPS), 2020 arXiv | article (spotlight talk)

The limits of distribution-free conditional predictive inference (with R. Barber, E. Candes, R. Tibshirani), Information and Inference, 2020 arXiv | article

Analyzing student strategies in blended courses using clickstream data (with N. Akpinar, U. Acar), Educational Data Mining, 2020 arXiv | article | talk (oral talk)

The power of batching in multiple hypothesis testing (with T. Zrnic, D. Jiang, M. Jordan), Intl. Conf. on AI and Statistics, 2020 arXiv | article | talk

Online control of the false coverage rate and false sign rate (with A. Weinstein), Intl. Conf. on Machine Learning (ICML), 2020 arXiv | article

Confidence sequences for sampling without replacement (with I. Waudby-Smith), Conf. on Neural Information Processing Systems (NeurIPS), 2020 arXiv | article | code (spotlight talk)

Universal inference (with L. Wasserman, S. Balakrishnan), Proc. of the National Academy of Sciences, 2020 arXiv | article | talk

A unified framework for bandit multiple testing (with Z. Xu, R. Wang), Conf. on Neural Information Processing Systems (NeurIPS), 2020 arXiv | article | talk | slides | code | TLDR Using e-values (or e-processes) and the e-BH procedure, we formulate a framework which provides false discovery rate (FDR) control at any stopping time for multiple testing in the bandit setting, that is robust to the dependencies induced by the user’s sampling and stopping policies.
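The e-BH procedure underlying this framework is short enough to sketch (an illustrative implementation, not the authors' code): with K e-values, reject the hypotheses holding the k* largest e-values, where k* is the largest k such that the k-th largest e-value is at least K/(alpha*k).

```python
import numpy as np

def ebh(e_values, alpha=0.05):
    """e-BH: the e-value analogue of Benjamini-Hochberg. Unlike BH on
    p-values, it controls the FDR at level alpha under arbitrary
    dependence among the e-values, with no correction factor."""
    e = np.asarray(e_values, dtype=float)
    K = len(e)
    order = np.argsort(-e)                 # indices by decreasing e-value
    ks = np.arange(1, K + 1)
    ok = e[order] >= K / (alpha * ks)      # k-th largest e-value vs K/(alpha*k)
    if not ok.any():
        return []                          # no rejections
    k_star = ks[ok].max()
    return sorted(order[:k_star].tolist())
```

For example, with e-values (100, 50, 1, 1, 1) and alpha = 0.05, the thresholds K/(alpha*k) are 100, 50, 33.3, ..., so exactly the first two hypotheses are rejected.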

Simultaneous high-probability bounds on the FDP in structured, regression and online settings (with E. Katsevich), Annals of Stat., 2020 arXiv | article | code

Time-uniform Chernoff bounds via nonnegative supermartingales (with S. Howard, J. Sekhon, J. McAuliffe), Prob. Surveys, 2020 arXiv | article | talk

STAR: A general interactive framework for FDR control under structural constraints (with L. Lei, W. Fithian), Biometrika, 2020 arXiv | article | poster | code

Familywise error rate control by interactive unmasking (with B. Duan, L. Wasserman), Intl. Conf. on Machine Learning (ICML), 2020 arXiv | article | code

Interactive martingale tests for the global null (with B. Duan, S. Balakrishnan, L. Wasserman), Electronic J. of Stat., 2020 arXiv | article | code

On conditional versus marginal bias in multi-armed bandits (with J. Shin, A. Rinaldo), Intl. Conf. on Machine Learning (ICML), 2020 arXiv | article

Are sample means in multi-armed bandits positively or negatively biased? (with J. Shin, A. Rinaldo), Conf. on Neural Information Processing Systems (NeurIPS), 2019 arXiv | article | poster

A higher order Kolmogorov-Smirnov test (with V. Sadhanala, Y. Wang, R. Tibshirani), Intl. Conf. on AI and Statistics, 2019 arXiv | article

ADDIS: an adaptive discarding algorithm for online FDR control with conservative nulls (with J. Tian), Conf. on Neural Information Processing Systems (NeurIPS), 2019 arXiv | code | article

A unified treatment of multiple testing with prior knowledge using the p-filter (with R. F. Barber, M. Wainwright, M. Jordan), Annals of Stat., 2019 arXiv | article | code

DAGGER: A sequential algorithm for FDR control on DAGs (with J. Chen, M. Wainwright, M. Jordan), Biometrika, 2019 arXiv | article | code

Conformal prediction under covariate shift (with R. Tibshirani, R. Barber, E. Candes), Conf. on Neural Information Processing Systems (NeurIPS), 2019 arXiv | article | poster

Optimal rates and tradeoffs in multiple testing (with M. Rabinovich, M. Wainwright, M. Jordan), Statistica Sinica, 2019 arXiv | article | poster

Function-specific mixing times and concentration away from equilibrium (with M. Rabinovich, M. Wainwright, M. Jordan), Bayesian Analysis, 2019 arXiv | article | poster

Decoding from pooled data (II): sharp information-theoretic bounds (with A. El-Alaoui, F. Krzakala, L. Zdeborova, M. Jordan), SIAM J. on Math. of Data Science, 2019 arXiv | article

Decoding from pooled data (I): phase transitions of message passing (with A. El-Alaoui, F. Krzakala, L. Zdeborova, M. Jordan), IEEE Trans. on Info. Theory, 2018 arXiv | article

On the power of online thinning in reducing discrepancy (with R. Dwivedi, O. N. Feldheim, Ori Gurel-Gurevich), Prob. Theory and Related Fields, 2018 arXiv | article | poster

On kernel methods for covariates that are rankings (with H. Mania, M. Wainwright, M. Jordan, B. Recht), Electronic J. of Stat., 2018 arXiv | article

SAFFRON: an adaptive algorithm for online FDR control (with T. Zrnic, M. Wainwright, M. Jordan), Intl. Conf. on Machine Learning (ICML), 2018 arXiv | article | code (full oral talk)

Online control of the false discovery rate with decaying memory (with F. Yang, M. Wainwright, M. Jordan), Conf. on Neural Information Processing Systems (NeurIPS), 2017 arXiv | article | poster | talk (from 44:00) (full oral talk)

MAB-FDR: Multi (A)rmed/(B)andit testing with online FDR control (with F. Yang, K. Jamieson, M. Wainwright), Conf. on Neural Information Processing Systems (NeurIPS), 2017 arXiv | article | code (spotlight talk)

QuTE: decentralized FDR control on sensor networks (with J. Chen, M. Wainwright, M. Jordan), IEEE Conf. on Decision and Control, 2017 arXiv | article | code | poster

Iterative methods for solving factorized linear systems (with A. Ma, D. Needell), SIAM J. on Matrix Analysis and Applications, 2017 arXiv | article

Rows vs. columns: randomized Kaczmarz or Gauss-Seidel for ridge regression (with A. Hefny, D. Needell), SIAM J. on Scientific Computing, 2017 arXiv | article

On Wasserstein two sample testing and related families of nonparametric tests (with N. Garcia, M. Cuturi), Entropy, 2017 arXiv | article

Generative models and model criticism via optimized maximum mean discrepancy (with D. Sutherland, H. Tung, H. Strathmann, S. De, A. Smola, A. Gretton), Intl. Conf. on Learning Representations, 2017 arXiv | article | poster | code

Minimax lower bounds for linear independence testing (with D. Isenberg, A. Singh, L. Wasserman), IEEE Intl. Symp. on Information Theory, 2016 arXiv | article

p-filter: multi-layer FDR control for grouped hypotheses (with COAUTHORS), J. of the Royal Stat. Society, Series B, 2016 arXiv | article | code | poster

Sequential nonparametric testing with the law of the iterated logarithm (with A. Balsubramani), Conf. on Uncertainty in AI, 2016 arXiv | article | errata

Asymptotic behavior of Lq-based Laplacian regularization in semi-supervised learning (with A. El-Alaoui, X. Cheng, M. Wainwright, M. Jordan), Conf. on Learning Theory, 2016 arXiv | article

Regularized brain reading with shrinkage and smoothing (with L. Wehbe, R. Steorts, C. Shalizi), Annals of Applied Stat., 2015 arXiv | article

On the high-dimensional power of a linear-time two sample test under mean-shift alternatives (with S. Reddi, A. Singh, B. Poczos, L. Wasserman), Intl. Conf. on AI and Statistics, 2015 arXiv | article | errata

On the decreasing power of kernel and distance based nonparametric hypothesis tests in high dimensions (with S. Reddi*, B. Poczos, A. Singh, L. Wasserman), AAAI Conf. on Artificial Intelligence, 2015 arXiv | article | supp

Fast two-sample testing with analytic representations of probability measures (with K. Chwialkowski, D. Sejdinovic, A. Gretton), Conf. on Neural Information Processing Systems (NeurIPS), 2015 arXiv | article | code

Nonparametric independence testing for small sample sizes (with L. Wehbe), Intl. Joint Conf. on AI, 2015 arXiv | article (oral talk)

Convergence properties of the randomized extended Gauss-Seidel and Kaczmarz methods (with A. Ma, D. Needell), SIAM J. on Matrix Analysis and Applications, 2015 arXiv | article | code

Fast & flexible ADMM algorithms for trend filtering (with R. Tibshirani), J. of Computational and Graphical Statistics, 2015 arXiv | article | talk | code

Towards a deeper geometric, analytic and algorithmic understanding of margins (with J. Pena), Opt. Methods and Software, 2015 arXiv | article

Margins, kernels and non-linear smoothed perceptrons (with J. Pena), Intl. Conf. on Machine Learning (ICML), 2014 arXiv | article | poster | talk (oral talk)

Simultaneously uncovering the patterns of brain regions involved in different story reading subprocesses (with L. Wehbe, B. Murphy, P. Talukdar, A. Fyshe, T. Mitchell), PLoS ONE, 2014 website | article

An analysis of active learning with uniform feature noise (with A. Singh, L. Wasserman, B. Poczos), Intl. Conf. on AI and Statistics, 2014 arXiv | article | poster | talk (oral talk)

Algorithmic connections between active learning and stochastic convex optimization (with A. Singh), Conf. on Algorithmic Learning Theory (ALT), 2013 arXiv | article | poster

Optimal rates for stochastic convex optimization under Tsybakov's noise condition (with A. Singh), Intl. Conf. on Machine Learning (ICML), 2013 arXiv | article | poster | talk (oral talk)

Adaptivity & computation-statistics tradeoffs for kernel & distance based high-dimensional two sample testing (with S. Reddi, B. Poczos, A. Singh, L. Wasserman). arXiv | poster

Algorithms for graph similarity and subgraph matching (with D. Koutra, A. Parikh, J. Xiang). report

Undergraduate Catalog

Department of Statistics and Data Science Courses: About Course Numbers

Each Carnegie Mellon course number begins with a two-digit prefix that designates the department offering the course (e.g., 76-xxx courses are offered by the Department of English). Although each department maintains its own course numbering practices, typically the first digit after the prefix indicates the class level: xx-1xx courses are freshman-level, xx-2xx courses are sophomore-level, etc. Depending on the department, xx-6xx courses may be either undergraduate senior-level or graduate-level, and xx-7xx courses and higher are graduate-level. Consult the Schedule of Classes each semester for course offerings and for any necessary prerequisites or corequisites.
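As an illustration only (the function name, return fields, and level labels below are invented for this sketch and are not part of the catalog), the numbering convention described above can be mimicked in a few lines of Python:

```python
def parse_course_number(course: str) -> dict:
    """Split a CMU-style course number like '36-705' into its parts.

    The two-digit prefix identifies the offering department; the first
    digit after the dash suggests the class level (a rough convention,
    not a strict rule -- xx-6xx in particular is department-dependent).
    """
    prefix, number = course.split("-")
    level_digit = int(number[0])
    if 1 <= level_digit <= 4:
        level = f"undergraduate (year {level_digit})"
    elif level_digit == 6:
        level = "senior- or graduate-level (department-dependent)"
    elif level_digit >= 7:
        level = "graduate-level"
    else:
        level = "department-specific"
    return {"department_code": prefix, "number": number, "level": level}

print(parse_course_number("36-705"))
```

For example, under this sketch "76-101" would be parsed as a first-year course offered by department code 76.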


Carnegie Mellon University School of Computer Science

Machine Learning Department


Ph.D. in Machine Learning

Machine learning is dedicated to furthering scientific understanding of automated learning and to producing the next generation of tools for data analysis and decision-making based on that understanding. The doctoral program in machine learning trains students to become tomorrow's leaders in this rapidly growing area.

Joint Ph.D. in Machine Learning and Public Policy

The Joint Ph.D. Program in Machine Learning and Public Policy is a new program for students to gain the skills necessary to develop state-of-the-art machine learning technologies and apply these technologies to real-world policy issues.

Joint Ph.D. in Neural Computation and Machine Learning

This Ph.D. program trains students in the application of machine learning to neuroscience by combining core elements of the machine learning Ph.D. program and the Ph.D. in neural computation offered by the Center for the Neural Basis of Cognition.

Joint Ph.D. in Statistics and Machine Learning

This joint program prepares students for academic careers in both computer science and statistics departments at top universities. Students in this track will be involved in courses and research from both the Department of Statistics and the Machine Learning Department.


Doctoral Studies in Public Policy & Management

Ph.D. Studies in Public Policy & Management

The doctoral program in Public Policy & Management at Carnegie Mellon University's Heinz College prepares students to apply a rigorous scientific approach to social, organizational, economic, and management problems in an increasingly connected world.

At Heinz, we live and work at the critical nexus of information technology and public policy. Our Ph.D. in Public Policy & Management was created to train students to approach problems from multiple disciplinary perspectives, to use advanced analytic and theoretical models, and to apply modern technological capabilities such as machine learning to the policy domain.

Heinz College Ph.D. students enjoy close partnerships with faculty as they explore the complex and exciting interconnectedness of information systems, public policy, and management. Upon graduating, our Ph.D.s receive desirable placements at academic institutions, government agencies, and consulting firms.

KEY RESEARCH AREAS

Doctoral students take on a broad range of topics and problems, but some key areas of strength at Heinz College include:

Analyzing and designing practical crime and drug policies is highly challenging. Many of our faculty have been working with sophisticated statistical and policy tools to understand the impact of various state and federal policies on crime rates and the sale of drugs. The research then leads to recommendations on how policies should be designed and updated.

Our faculty are widely acclaimed for their work and influence in this domain and have won top academic honors. Key faculty members are Al Blumstein, Jon Caulkins, Amelia Haviland, and Daniel Nagin.

The functioning of energy markets and the consequences of environmental regulations have been widely debated in many countries around the world. Our faculty have been actively working on issues related to innovation in the energy sector, the impact of energy infrastructure and energy production on local development and pollution, the effects of air and water pollution on health outcomes, and the costs and benefits of environmental regulations.

Key faculty members are Lee Branstetter, Karen Clay, Akshaya Jha, and Edson Severnini.

CMU's Scott Institute for Energy Innovation also has a number of faculty working on a broad array of topics related to energy and the environment.

The importance of health care policy cannot be overstated. Many faculty members are working on the important roles of competition, technology, and regulation in shaping the cost and quality of health care delivery.

Our faculty are widely recognized for their academic excellence as well as their influence on policy-making. Some key faculty are Martin Gaynor, Amelia Haviland, Rema Padman, and Lowell Taylor.

We have a strong group of faculty in the Management Science domain who use optimization/operations research techniques to solve public policy problems.

In particular, faculty such as Al Blumstein, Jon Caulkins, and Ramayya Krishnan use these techniques to address problems related to urban planning and transportation research, policy analysis, data mining, and more.

This group is focused on understanding the role of social networks within organizations, the role of teams, and evidence-based analysis. Our faculty includes experts on network-based analysis who examine how influence diffuses. 

David Krackhardt uses his expertise in sociology, economics, and statistics to examine these issues. 

Denise Rousseau is a leading expert on evidence-based management within organizations.

Ph.D. Curriculum

The pre-dissertation stage of the Ph.D. in Public Policy & Management is structured around two sets of requirements: coursework and preliminary papers.

Coursework is designed to build methodological skills, modeling competence, and substantive depth.

Preliminary papers illustrate your ability to produce effective research that exhibits your readiness to begin the dissertation.

  • A three-semester Ph.D. Seminar Series focusing on the research process
  • Two semesters of Advanced Electives offering depth in specialized fields
  • A Quantitative Methods Cluster of courses in statistics, econometrics, and machine learning
  • Two semesters of coursework in Social and Policy Sciences
  • A Concentration Area Requirement, combining research and courses to support your research agenda and long-term professional objectives

Admission to candidacy means that all requirements of the Ph.D. program preliminary to the dissertation have been fulfilled. In addition to satisfying all coursework requirements, you must also meet the following research requirements:

  • First- and second-year Research Papers meeting current Ph.D. requirements
  • A dissertation focused on a Public Policy topic, as judged by the Ph.D. committee

While fulfilling these requirements, you'll work closely with the faculty to develop individualized programs of study and research that meet your goals.

Electrical and Computer Engineering

College of Engineering, PhD in ECE

Students in the ECE PhD program are provided with a research-intensive study of the fundamentals of electrical and computer engineering. Students will create and disseminate knowledge of electrical and computer systems during the course of obtaining the PhD degree. Upon enrollment in the department, students, with the help of a faculty advisor, define an education and research program consistent with their background and best suited to their own academic goals.

Other programs

Center for the Neural Basis of Cognition (CNBC)

ECE PhD students interested in the neural basis of cognition can apply to the CNBC's Graduate Training Program, which allows students to combine neuroscience and engineering in an interdisciplinary training program.

The Information & Communication Technologies Institute at Carnegie Mellon (ICTI@CMU)

The Information and Communication Technologies Institute is an international "virtual" institution with poles in Portugal (ICTI@Portugal) and at Carnegie Mellon University (ICTI@CMU). Interested students may apply for doctoral study in ECE through the ICTI. Those accepted will take courses that have been approved by the Carnegie Mellon ECE Department at a partner institution in Portugal.

ECE/Thailand

Established as a collaboration between Carnegie Mellon University and King Mongkut's Institute of Technology Ladkrabang (KMITL), ECE/Thailand will provide cutting-edge engineering research and education in Southeast Asia.

Helpful links

To apply, select Master's or Doctoral from the dropdown, then select Electrical and Computer Engineering. The list of available programs will then display. You may select up to three programs for which to be considered.



Graduate admissions

The College of Engineering provides a diverse array of exceptional master’s and Ph.D. programs through its nine departments and institutes. On this page, you will find links to learn about admission requirements, the application process, financial aid options, how to request information, and other important details.

Admission requirements by department/institute:

Use the links below to find admission requirements and application windows for your department, institute, or interdisciplinary program of interest:

  • Biomedical Engineering
  • Chemical Engineering
  • Civil and Environmental Engineering
  • Electrical and Computer Engineering
  • Engineering and Public Policy
  • Information Networking Institute
  • Integrated Innovation Institute
  • Materials Science and Engineering
  • Mechanical Engineering
  • CMU-Silicon Valley
  • Energy Science, Technology, and Policy
  • Engineering and Technology Innovation Management
  • Apply to our graduate programs
  • College of Engineering rankings
  • Financial support
  • Request information
  • Graduate student information session materials
  • Contact the Office of Graduate Affairs


Societal Computing

Software and Societal Systems Department: Program Plan, PhD in Societal Computing

The Doctor of Philosophy is a full-time residential program. The PhD is appropriate for candidates who want to become independent academic researchers, for-profit or social entrepreneurs, or applied research leaders in industry. 

Program Structure

Every Year:  Students become active participants in ongoing research projects from day 1. We firmly believe that in order to become mature, independent, world class researchers, students must begin acquiring the hands-on skills and knowledge right away.

Years One and Two:  Students complete the bulk of their coursework in the first two years, while continuing to spend about half of their time working in a research project with their advisors, and often with other faculty and students.

Year Three:  Usually in the third year, students form a thesis committee, chaired by their advisor, that is carefully selected to provide the combination of expertise that can most effectively guide the student through a dissertation. The student selects a thesis topic, develops a proposal in consultation with the committee, and defends it in a public session. On passing, and on completing all course, TA, and skill requirements, the student attains Candidate status.

Year Four or Five:  The student completes the dissertation research, writes up the thesis, and defends it publicly. After addressing any concerns and comments the committee may have, all requirements are complete and the student becomes our newest Societal Computing PhD.

Course of Study

The Societal Computing curriculum ensures all students have a solid foundation in computational methods, society and organizations, and policy. At the same time, it retains enough flexibility to enable students to further specialize in areas that are more closely relevant to their research interests.

Students are required to take two (2) semesters of Societal Computing Practicum, four (4) Star Courses, and three (3) elective courses, and to serve as a Teaching Assistant for two (2) semesters. For more information on these policies, please refer to the Course Requirements section of the Societal Computing PhD Student Handbook on our Current Students' page.
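As a rough sketch only (the class and field names below are hypothetical and not from the handbook), the unit counts above can be written as a small machine-checkable checklist:

```python
from dataclasses import dataclass


@dataclass
class SCPlan:
    """A hypothetical record of one student's completed requirement units."""
    practicum_semesters: int
    star_courses: int
    electives: int
    ta_semesters: int


# Minimums stated in the program description: 2 practicum semesters,
# 4 Star Courses, 3 electives, and 2 semesters of TA service.
REQUIRED = SCPlan(practicum_semesters=2, star_courses=4, electives=3, ta_semesters=2)


def meets_requirements(plan: SCPlan) -> bool:
    """Check each count in the plan against the stated minimum."""
    return (plan.practicum_semesters >= REQUIRED.practicum_semesters
            and plan.star_courses >= REQUIRED.star_courses
            and plan.electives >= REQUIRED.electives
            and plan.ta_semesters >= REQUIRED.ta_semesters)
```

A plan missing any one unit (say, only three Star Courses) would fail this check.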

Sample schedules illustrate the courses two different sets of students could take in their first three years.

Machine Learning - CMU

PhD Dissertations

PhD Dissertations

[All are PDF files.]

Learning Models that Match Jacob Tyo, 2024

Improving Human Integration across the Machine Learning Pipeline Charvi Rastogi, 2024

Reliable and Practical Machine Learning for Dynamic Healthcare Settings Helen Zhou, 2023

Automatic customization of large-scale spiking network models to neuronal population activity (unavailable) Shenghao Wu, 2023

Estimation of BVk functions from scattered data (unavailable) Addison J. Hu, 2023

Rethinking object categorization in computer vision (unavailable) Jayanth Koushik, 2023

Advances in Statistical Gene Networks Jinjin Tian, 2023

Post-hoc calibration without distributional assumptions Chirag Gupta, 2023

The Role of Noise, Proxies, and Dynamics in Algorithmic Fairness Nil-Jana Akpinar, 2023

Collaborative learning by leveraging siloed data Sebastian Caldas, 2023

Modeling Epidemiological Time Series Aaron Rumack, 2023

Human-Centered Machine Learning: A Statistical and Algorithmic Perspective Leqi Liu, 2023

Uncertainty Quantification under Distribution Shifts Aleksandr Podkopaev, 2023

Probabilistic Reinforcement Learning: Using Data to Define Desired Outcomes, and Inferring How to Get There Benjamin Eysenbach, 2023

Comparing Forecasters and Abstaining Classifiers Yo Joong Choe, 2023

Using Task Driven Methods to Uncover Representations of Human Vision and Semantics Aria Yuan Wang, 2023

Data-driven Decisions - An Anomaly Detection Perspective Shubhranshu Shekhar, 2023

Applied Mathematics of the Future Kin G. Olivares, 2023

Methods and Applications of Explainable Machine Learning Joon Sik Kim, 2023

Neural Reasoning for Question Answering Haitian Sun, 2023

Principled Machine Learning for Societally Consequential Decision Making Amanda Coston, 2023

Long term brain dynamics extend cognitive neuroscience to timescales relevant for health and physiology Maxwell B. Wang, 2023

Long term brain dynamics extend cognitive neuroscience to timescales relevant for health and physiology Darby M. Losey, 2023

Calibrated Conditional Density Models and Predictive Inference via Local Diagnostics David Zhao, 2023

Towards an Application-based Pipeline for Explainability Gregory Plumb, 2022

Objective Criteria for Explainable Machine Learning Chih-Kuan Yeh, 2022

Making Scientific Peer Review Scientific Ivan Stelmakh, 2022

Facets of regularization in high-dimensional learning: Cross-validation, risk monotonization, and model complexity Pratik Patil, 2022

Active Robot Perception using Programmable Light Curtains Siddharth Ancha, 2022

Strategies for Black-Box and Multi-Objective Optimization Biswajit Paria, 2022

Unifying State and Policy-Level Explanations for Reinforcement Learning Nicholay Topin, 2022

Sensor Fusion Frameworks for Nowcasting Maria Jahja, 2022

Equilibrium Approaches to Modern Deep Learning Shaojie Bai, 2022

Towards General Natural Language Understanding with Probabilistic Worldbuilding Abulhair Saparov, 2022

Applications of Point Process Modeling to Spiking Neurons (Unavailable) Yu Chen, 2021

Neural variability: structure, sources, control, and data augmentation Akash Umakantha, 2021

Structure and time course of neural population activity during learning Jay Hennig, 2021

Cross-view Learning with Limited Supervision Yao-Hung Hubert Tsai, 2021

Meta Reinforcement Learning through Memory Emilio Parisotto, 2021

Learning Embodied Agents with Scalably-Supervised Reinforcement Learning Lisa Lee, 2021

Learning to Predict and Make Decisions under Distribution Shift Yifan Wu, 2021

Statistical Game Theory Arun Sai Suggala, 2021

Towards Knowledge-capable AI: Agents that See, Speak, Act and Know Kenneth Marino, 2021

Learning and Reasoning with Fast Semidefinite Programming and Mixing Methods Po-Wei Wang, 2021

Bridging Language in Machines with Language in the Brain Mariya Toneva, 2021

Curriculum Learning Otilia Stretcu, 2021

Principles of Learning in Multitask Settings: A Probabilistic Perspective Maruan Al-Shedivat, 2021

Towards Robust and Resilient Machine Learning Adarsh Prasad, 2021

Towards Training AI Agents with All Types of Experiences: A Unified ML Formalism Zhiting Hu, 2021

Building Intelligent Autonomous Navigation Agents Devendra Chaplot, 2021

Learning to See by Moving: Self-supervising 3D Scene Representations for Perception, Control, and Visual Reasoning Hsiao-Yu Fish Tung, 2021

Statistical Astrophysics: From Extrasolar Planets to the Large-scale Structure of the Universe Collin Politsch, 2020

Causal Inference with Complex Data Structures and Non-Standard Effects Kwhangho Kim, 2020

Networks, Point Processes, and Networks of Point Processes Neil Spencer, 2020

Dissecting neural variability using population recordings, network models, and neurofeedback (Unavailable) Ryan Williamson, 2020

Predicting Health and Safety: Essays in Machine Learning for Decision Support in the Public Sector Dylan Fitzpatrick, 2020

Towards a Unified Framework for Learning and Reasoning Han Zhao, 2020

Learning DAGs with Continuous Optimization Xun Zheng, 2020

Machine Learning and Multiagent Preferences Ritesh Noothigattu, 2020

Learning and Decision Making from Diverse Forms of Information Yichong Xu, 2020

Towards Data-Efficient Machine Learning Qizhe Xie, 2020

Change modeling for understanding our world and the counterfactual one(s) William Herlands, 2020

Machine Learning in High-Stakes Settings: Risks and Opportunities Maria De-Arteaga, 2020

Data Decomposition for Constrained Visual Learning Calvin Murdock, 2020

Structured Sparse Regression Methods for Learning from High-Dimensional Genomic Data Micol Marchetti-Bowick, 2020

Towards Efficient Automated Machine Learning Liam Li, 2020

Learning Collections of Functions Emmanouil Antonios Platanios, 2020

Provable, structured, and efficient methods for robustness of deep networks to adversarial examples Eric Wong, 2020

Reconstructing and Mining Signals: Algorithms and Applications Hyun Ah Song, 2020

Probabilistic Single Cell Lineage Tracing Chieh Lin, 2020

Graphical network modeling of phase coupling in brain activity (unavailable) Josue Orellana, 2019

Strategic Exploration in Reinforcement Learning - New Algorithms and Learning Guarantees Christoph Dann, 2019

Learning Generative Models using Transformations Chun-Liang Li, 2019

Estimating Probability Distributions and their Properties Shashank Singh, 2019

Post-Inference Methods for Scalable Probabilistic Modeling and Sequential Decision Making Willie Neiswanger, 2019

Accelerating Text-as-Data Research in Computational Social Science Dallas Card, 2019

Multi-view Relationships for Analytics and Inference Eric Lei, 2019

Information flow in networks based on nonstationary multivariate neural recordings Natalie Klein, 2019

Competitive Analysis for Machine Learning & Data Science Michael Spece, 2019

The When, Where and Why of Human Memory Retrieval Qiong Zhang, 2019

Towards Effective and Efficient Learning at Scale Adams Wei Yu, 2019

Towards Literate Artificial Intelligence Mrinmaya Sachan, 2019

Learning Gene Networks Underlying Clinical Phenotypes Under SNP Perturbations From Genome-Wide Data Calvin McCarter, 2019

Unified Models for Dynamical Systems Carlton Downey, 2019

Anytime Prediction and Learning for the Balance between Computation and Accuracy Hanzhang Hu, 2019

Statistical and Computational Properties of Some "User-Friendly" Methods for High-Dimensional Estimation Alnur Ali, 2019

Nonparametric Methods with Total Variation Type Regularization Veeranjaneyulu Sadhanala, 2019

New Advances in Sparse Learning, Deep Networks, and Adversarial Learning: Theory and Applications Hongyang Zhang, 2019

Gradient Descent for Non-convex Problems in Modern Machine Learning Simon Shaolei Du, 2019

Selective Data Acquisition in Learning and Decision Making Problems Yining Wang, 2019

Anomaly Detection in Graphs and Time Series: Algorithms and Applications Bryan Hooi, 2019

Neural dynamics and interactions in the human ventral visual pathway Yuanning Li, 2018

Tuning Hyperparameters without Grad Students: Scaling up Bandit Optimisation Kirthevasan Kandasamy, 2018

Teaching Machines to Classify from Natural Language Interactions Shashank Srivastava, 2018

Statistical Inference for Geometric Data Jisu Kim, 2018

Representation Learning @ Scale Manzil Zaheer, 2018

Diversity-promoting and Large-scale Machine Learning for Healthcare Pengtao Xie, 2018

Distribution and Histogram (DIsH) Learning Junier Oliva, 2018

Stress Detection for Keystroke Dynamics Shing-Hon Lau, 2018

Sublinear-Time Learning and Inference for High-Dimensional Models Enxu Yan, 2018

Neural population activity in the visual cortex: Statistical methods and application Benjamin Cowley, 2018

Efficient Methods for Prediction and Control in Partially Observable Environments Ahmed Hefny, 2018

Learning with Staleness Wei Dai, 2018

Statistical Approach for Functionally Validating Transcription Factor Bindings Using Population SNP and Gene Expression Data Jing Xiang, 2017

New Paradigms and Optimality Guarantees in Statistical Learning and Estimation Yu-Xiang Wang, 2017

Dynamic Question Ordering: Obtaining Useful Information While Reducing User Burden Kirstin Early, 2017

New Optimization Methods for Modern Machine Learning Sashank J. Reddi, 2017

Active Search with Complex Actions and Rewards Yifei Ma, 2017

Why Machine Learning Works George D. Montañez, 2017

Source-Space Analyses in MEG/EEG and Applications to Explore Spatio-temporal Neural Dynamics in Human Vision Ying Yang, 2017

Computational Tools for Identification and Analysis of Neuronal Population Activity Pengcheng Zhou, 2016

Expressive Collaborative Music Performance via Machine Learning Gus (Guangyu) Xia, 2016

Supervision Beyond Manual Annotations for Learning Visual Representations Carl Doersch, 2016

Exploring Weakly Labeled Data Across the Noise-Bias Spectrum Robert W. H. Fisher, 2016

Optimizing Optimization: Scalable Convex Programming with Proximal Operators Matt Wytock, 2016

Combining Neural Population Recordings: Theory and Application William Bishop, 2015

Discovering Compact and Informative Structures through Data Partitioning Madalina Fiterau-Brostean, 2015

Machine Learning in Space and Time Seth R. Flaxman, 2015

The Time and Location of Natural Reading Processes in the Brain Leila Wehbe, 2015

Shape-Constrained Estimation in High Dimensions Min Xu, 2015

Spectral Probabilistic Modeling and Applications to Natural Language Processing Ankur Parikh, 2015

Computational and Statistical Advances in Testing and Learning Aaditya Kumar Ramdas, 2015

Corpora and Cognition: The Semantic Composition of Adjectives and Nouns in the Human Brain Alona Fyshe, 2015

Learning Statistical Features of Scene Images Wooyoung Lee, 2014

Towards Scalable Analysis of Images and Videos Bin Zhao, 2014

Statistical Text Analysis for Social Science Brendan T. O'Connor, 2014

Modeling Large Social Networks in Context Qirong Ho, 2014

Semi-Cooperative Learning in Smart Grid Agents Prashant P. Reddy, 2013

On Learning from Collective Data Liang Xiong, 2013

Exploiting Non-sequence Data in Dynamic Model Learning Tzu-Kuo Huang, 2013

Mathematical Theories of Interaction with Oracles Liu Yang, 2013

Short-Sighted Probabilistic Planning Felipe W. Trevizan, 2013

Statistical Models and Algorithms for Studying Hand and Finger Kinematics and their Neural Mechanisms Lucia Castellanos, 2013

Approximation Algorithms and New Models for Clustering and Learning Pranjal Awasthi, 2013

Uncovering Structure in High-Dimensions: Networks and Multi-task Learning Problems Mladen Kolar, 2013

Learning with Sparsity: Structures, Optimization and Applications Xi Chen, 2013

GraphLab: A Distributed Abstraction for Large Scale Machine Learning Yucheng Low, 2013

Graph Structured Normal Means Inference James Sharpnack, 2013 (Joint Statistics & ML PhD)

Probabilistic Models for Collecting, Analyzing, and Modeling Expression Data Hai-Son Phuoc Le, 2013

Learning Large-Scale Conditional Random Fields Joseph K. Bradley, 2013

New Statistical Applications for Differential Privacy Rob Hall, 2013 (Joint Statistics & ML PhD)

Parallel and Distributed Systems for Probabilistic Reasoning Joseph Gonzalez, 2012

Spectral Approaches to Learning Predictive Representations Byron Boots, 2012

Attribute Learning using Joint Human and Machine Computation Edith L. M. Law, 2012

Statistical Methods for Studying Genetic Variation in Populations Suyash Shringarpure, 2012

Data Mining Meets HCI: Making Sense of Large Graphs Duen Horng (Polo) Chau, 2012

Learning with Limited Supervision by Input and Output Coding Yi Zhang, 2012

Target Sequence Clustering Benjamin Shih, 2011

Nonparametric Learning in High Dimensions Han Liu, 2010 (Joint Statistics & ML PhD)

Structural Analysis of Large Networks: Observations and Applications Mary McGlohon, 2010

Modeling Purposeful Adaptive Behavior with the Principle of Maximum Causal Entropy Brian D. Ziebart, 2010

Tractable Algorithms for Proximity Search on Large Graphs Purnamrita Sarkar, 2010

Rare Category Analysis Jingrui He, 2010

Coupled Semi-Supervised Learning Andrew Carlson, 2010

Fast Algorithms for Querying and Mining Large Graphs Hanghang Tong, 2009

Efficient Matrix Models for Relational Learning Ajit Paul Singh, 2009

Exploiting Domain and Task Regularities for Robust Named Entity Recognition Andrew O. Arnold, 2009

Theoretical Foundations of Active Learning Steve Hanneke, 2009

Generalized Learning Factors Analysis: Improving Cognitive Models with Machine Learning Hao Cen, 2009

Detecting Patterns of Anomalies Kaustav Das, 2009

Dynamics of Large Networks Jurij Leskovec, 2008

Computational Methods for Analyzing and Modeling Gene Regulation Dynamics Jason Ernst, 2008

Stacked Graphical Learning Zhenzhen Kou, 2007

Actively Learning Specific Function Properties with Applications to Statistical Inference Brent Bryan, 2007

Approximate Inference, Structure Learning and Feature Estimation in Markov Random Fields Pradeep Ravikumar, 2007

Scalable Graphical Models for Social Networks Anna Goldenberg, 2007

Measure Concentration of Strongly Mixing Processes with Applications Leonid Kontorovich, 2007

Tools for Graph Mining Deepayan Chakrabarti, 2005

Automatic Discovery of Latent Variable Models Ricardo Silva, 2005



Associate Data Scientist

Carnegie Mellon University


Carnegie Mellon University

Carnegie Mellon University challenges the curious and passionate to imagine and deliver work that matters.

A private, global research university, Carnegie Mellon stands among the world's most renowned educational institutions and sets its own course. Start the journey here.

Over the past 10 years, more than 400 startups linked to CMU have raised more than $7 billion in follow-on funding. Those investment numbers are especially high because of the sheer size of Pittsburgh’s growing autonomous vehicles cluster – including Uber, Aurora, Waymo and Motional – all of which are here because of their strong ties to CMU.

With cutting-edge brain science, path-breaking performances, innovative startups, driverless cars, big data, big ambitions, Nobel and Turing prizes, hands-on learning, and a whole lot of robots, CMU doesn't just imagine the future; we create it.

Many come to Pittsburgh because it is a hot spot for entrepreneurship and a model for future cities. Others come for the city's burgeoning food scene.



Match Recap: Men's Tennis | 4/20/2024 4:22:00 PM

#4 CWRU Defeats Carnegie Mellon 6-3 to Wrap-Up Regular Season

The fourth-ranked Case Western Reserve University men's tennis team wrapped up its 2024 regular season with a 6-3 win over 25th-ranked Carnegie Mellon University in Pittsburgh, Pennsylvania on Saturday afternoon. The Spartans improved to 23-4 for the season with the win, ending their regular season on a 10-match winning streak. CWRU also raised its record to 11-2 against nationally ranked Division III opponents. Carnegie Mellon dropped to 6-9 with the loss. CWRU senior Vishwa Aduru and graduate student Diego Maza helped the Spartans take the early lead with an 8-1 win at first doubles against Raghav Jangbahadur and Daniel Kong. The remaining two doubles matches were each close. The Tartans' third doubles duo of Nicholas Wernink and Nadim Motaghedi bested seniors Daniel French and Yuvraj Narang 8-6 to even the team score at 1-1. However, at second doubles, senior Sahil Dayal and sophomore Anmay Devaraj were able to come from behind against Akshay Joshi and Derek Wong, winning in a tiebreaker 8-7 (1) to put CWRU ahead 2-1 heading into singles play. Carnegie Mellon notched the first victory in singles play with Jangbahadur winning 6-3, 6-3 at no. 1 singles against junior Ajay Mahenthiran . However, the Spartans went on to win the next three singles matches to clinch the victory. Aduru put CWRU back in front 3-2 with a 7-5, 6-1 win against Kong at second singles and junior Casey Hishinuma claimed a 1-6, 6-2, 6-2 victory over Jonathan Gu at fifth singles to put the Spartans a point away from the team win. Graduate student Michael Sutanto provided the clinching victory with a 7-6 (5), 6-2 win over Motaghedi at third singles. After the match was decided, Maza secured a 6-2, 7-6 (8) win against Joshi at sixth singles and junior Ansh Shah suffered a three-set setback to Wernink at fourth singles, 6-3, 6-7 (4), 10-6. 
The Spartans will now turn their attention toward the postseason, beginning with the University Athletic Association Championship Tournament starting on Thursday morning at 8:30 a.m. Seeding for the tournament will be announced early next week.


3D Bioprinting & Biofabrication

Online Graduate Certificate

Cutting-Edge Curriculum

Bringing medicine one step closer to the creation of functional organs and tissues.

It is now possible for you to access the training you need to bring 3D bioprinting back to your organization. If you’re one of these trailblazers, our certificate program will give you the tools, inspiration, and knowledge to embark on 3D bioprinting and biofabrication research in your lab.

Curriculum Overview

After you enroll in the 3D Bioprinting and Biofabrication graduate certificate, you will take three graduate-level, credit-bearing courses. Each course will appear on your Carnegie Mellon transcript with the grade earned.

To earn the certificate, you must successfully complete all courses in the program. If you are only interested in one course, however, you may complete that course only and it will show on your transcript with the grade earned. 

The certificate includes the following courses taught by CMU faculty:

Biomaterials

Course Number:  42-880

Units:  9 units

Study the synthesis, characterization, and functional properties of organic and inorganic biomaterials such as natural biopolymers, synthetic polymers, and soft materials, with additional treatment of metals and ceramics. In this course, you will learn fundamental issues related to the utility of biomaterials, such as biomechanics, transport, degradability, biointerfaces and biocompatibility, stability, and fate in the body, along with basic approaches to characterization. Clinical applications for biomaterials and new directions in design and synthesis to achieve better biocompatibility will also be emphasized.

Course Waivers: Students with an educational background in biomaterials are eligible to request a waiver for the Biomaterials course. The process is currently changing from a syllabus review to an assessment demonstrating proficiency in the subject, as outlined below:

  • Students who matriculate in Summer 2024 who have successfully completed a biomaterials course with a grade of B or better may submit a transcript and copy of the syllabus to request the waiver as part of the application.
  • Students who matriculate in Fall 2024 and beyond will be required to complete an assessment of their skill level. Indicate your interest in taking the assessment within the application.  

If the waiver is granted, no credit will be earned, nor will tuition be assessed, for the course. For more information about course waivers, contact an admissions counselor today.

Bioprinting & Biofabrication

Course Number:  42-887

Units:  12 units

Gain hands-on experience with methods that are used to fabricate scaffolds often used in tissue engineering, drug delivery, and some medical devices. For example, you will learn fused deposition modeling (FDM) to 3D print thermoplastic materials and molds for casting soft hydrogel materials. You will also learn how to 3D bioprint soft hydrogel materials into a support bath material. Topics covered in this course include (but are not limited to): chemical and physical properties of biomaterials, CAD, and post-processing methods.

Tissue Engineering

Course Number:  42-882

Explore advanced cellular and tissue engineering methods that apply physical, mechanical and chemical manipulation of materials in order to direct cell and tissue function. Students will learn the techniques and equipment of bench research including cell culture, immunofluorescent imaging, soft lithography, variable stiffness substrates, application/measurement of forces and other methods. Emphasis will be placed on developing the written and oral communication skills required of the professional scientist. 

The course includes an optional three-day on-campus lab experience where students can gain practical hands-on experience with cell culturing and tissue development. CMU will provide lab access and instruction; students are responsible for all costs related to travel and lodging.

Application Deadlines

Priority*: May 14, 2024. Final: July 30, 2024.

*All applicants who submit by the priority deadline will receive a partial scholarship award.

Request Info

Questions? There are two ways to contact us: call 412-501-2150 or send an email to [email protected] with your inquiries.

Meet Our World-Class Faculty

Dr. Rachelle Palchesko

Education: Ph.D., Duquesne University

Research Interests: Tissue engineering, biomaterials, cell injection therapy, corneal endothelium, cystic fibrosis

Research Focus: Creating biomimetic, extracellular-matrix-based materials for tissue engineering and cell delivery. Much of Dr. Palchesko's research at CMU has focused on treating corneal blindness, for example by developing a technology to increase the integration of corneal endothelial cells post-injection for cell-injection-therapy-based approaches. She is currently adapting the cell injection technology for use in the treatment of cystic fibrosis and other diseases.

The Building Blocks of Our Curriculum

gold icon of forward thinking

Outside of the Box

There’s the traditional way of practicing medicine, and then there are forward-thinking techniques. At Carnegie Mellon, we empower students to learn the latest technologies and brainstorm big ideas they can put into practice today or tomorrow. In this program, you will learn the fundamentals of fabricating structurally complex 3D tissues, how to adapt a standard 3D printer for bioprinting purposes, and how to design and fabricate biomedical scaffolds for tissue engineering, drug delivery, and medical devices. This program equips you with everything you need to do what was once thought impossible.

Gold icon of hands-on learning

Hands-On Learning

With 3D bioprinting, you’re not learning if you’re not doing. That’s why our certificate takes an intense hands-on approach. In every course, you will learn how to work with appropriate materials, choose the right cells for tissue engineering, perform fabrication methods, and communicate your results to experts. Best of all, when you’ve finished the coursework, you will understand how to apply this knowledge in a real-life laboratory. Many of our students will take their knowledge back to the benchside to test product biocompatibility, biodegradability, and functionality.

Gold icon of microscope and lab consumables

Bench to Bedside

The ultimate goal of 3D bioprinting is to bring us closer to fabricating organs and other tissues for patients. This certificate will prepare you to plan, organize, and lead your own bench-to-bedside project in a small group setting. This rigorous project will require you to select appropriate materials based on their properties, engage in simulated in vitro and in vivo testing of these materials, and follow key steps in the FDA approval process. It’s everything you need to bring 3D bioprinting back to your own lab.

IMAGES

  1. Erik A. Bensen

  2. Robert E. Kass Elected to National Academy of Sciences

  3. Joel Greenhouse

  4. Mikael Kuusela

  5. CMU Talent Insider: Statistics and Data Science at Carnegie Mellon

  6. Carnegie Mellon University Guide

VIDEO

  1. Carnegie Mellon Sports Analytics Conference: Ben Baumer

  2. CMSAC 2023: CMSACamp Students Talk

  3. Carnegie Mellon Philharmonic

COMMENTS

  1. Statistics & Data Science

    The Ph.D. programs of the Department of Statistics at Carnegie Mellon University enable students to pursue a wide range of research opportunities, including constructing and implementing advanced methods of data analysis to address crucial cross-disciplinary questions, along with developing the fundamental theory that supports these methods.

  2. Statistics & Data Science

    Carnegie Mellon University's Department of Statistics & Data Science is world-renowned for the significance of its contributions to statistical theory and practice and for its outstanding interdisciplinary applied research and will prepare you to innovate with data and tackle pressing local, national and global challenges.

  3. Statistics & Data Science

    Past graduate students in Statistics have undergraduate majors in fields such as mathematics, engineering, the sciences, economics, psychology, or administration and management. ... Statistics & Data Science Dietrich College of Humanities and Social Sciences Carnegie Mellon University 5000 Forbes Avenue Pittsburgh, PA 15213 (412) 268-2717 ...

  4. Department of Statistics and Data Science

    B.S. in Statistics. Glenn Clune, Academic Program Manager Location: Baker Hall 129 [email protected]. Students in the Bachelor of Science program develop and master a wide array of skills in computing, mathematics, statistical theory, and the interpretation and display of complex data.

  5. Ann B Lee

Welcome! I am a professor in the Department of Statistics & Data Science at Carnegie Mellon University, with a joint appointment in the Machine Learning Department. Prior to joining CMU, I was the J.W. Gibbs Assistant Professor in the department of mathematics at Yale University, and before that I served a year as a visiting research associate at the department of applied mathematics at Brown ...

  6. Aaditya Ramdas' Webpage

    Aaditya Ramdas (PhD, 2015) is an assistant professor at Carnegie Mellon University, in the Departments of Statistics and Machine Learning. He was a postdoc at UC Berkeley (2015-2018) mentored by Michael Jordan and Martin Wainwright, and obtained his PhD at CMU (2010-2015) under Aarti Singh and Larry Wasserman, receiving the Umesh K ...

  7. Apply to the PhD Program

Application process and deadlines. The application deadline for the Ph.D. in Public Policy and Management and the Ph.D. in Information Systems and Management is December 1. JOINT PH.D. PROGRAM OPTIONS: If you are interested in any of the joint Ph.D. degree programs, you should submit only one application for admission:

  8. Joint Machine Learning PhD Degrees

    Similarly, this program differs from the Statistics PhD program in its emphasis on machine learning and computer science. The Joint PhD Program in Machine Learning and Statistics is aimed at preparing students for academic careers in both CS and Statistics departments at top universities or industry. ... Your unofficial Carnegie Mellon ...

  9. PhD Curriculum

    The Machine Learning Department at Carnegie Mellon University is ranked as #1 in the world for AI and Machine Learning, we offer Undergraduate, Masters and PhD programs. ... 10-718* Machine Learning in Practice *Students who are in the joint PhD program in ML & Statistics may satisfy this requirement through the ADA project in Statistics ...

  10. Department of Statistics and Data Science Courses < Carnegie Mellon

    36-202 Methods for Statistics & Data Science. All Semesters: 9 units This course builds on the principles and methods of statistical reasoning developed in 36-200 (or its equivalents). The course covers simple and multiple regression, basic analysis of variance methods, logistic regression, and introduction to data mining including ...

  11. PhD Program in Machine Learning

    The Machine Learning Department at Carnegie Mellon University is ranked as #1 in the world for AI and Machine Learning, we offer Undergraduate, Masters and PhD programs. Our faculty are world renowned in the field, and are constantly recognized for their contributions to Machine Learning and AI. ... statistics, complexity theory, optimization ...

  12. Statistics

To learn more about how student insurance works at Carnegie Mellon University and/or in the United States, please visit the Student Insurance Portal. Other requirements. General requirements. ... Update: For the application cycle starting Fall 2022, the GRE test is not required for application to the Statistics PhD program. You may still report the ...

  13. Machine Learning Department

    Ph.D. in Machine Learning. Machine learning is dedicated to furthering scientific understanding of automated learning and to producing the next generation of tools for data analysis and decision-making based on that understanding. The doctoral program in machine learning trains students to become tomorrow's leaders in this rapidly growing area.

  14. Statistics/Machine Learning Joint Ph.D. Degree

Statistics/Machine Learning Joint Ph.D. Degree ... please send email to: [email protected] ML Joint Program Requirements ... Statistics & Data Science Dietrich College of Humanities and Social Sciences Carnegie Mellon University 5000 Forbes Avenue Pittsburgh, PA 15213 (412) 268-2717 Contact Us. Legal Info;

  15. Ph.D. in Public Policy and Management

    At Heinz, we live and work at the critical nexus of information technology and public policy. Our Ph.D. in Public Policy & Management was created to train students to approach problems from multiple disciplinary perspectives, to use advanced analytic and theoretical models, and to apply modern technological capabilities such as machine learning to the policy domain.

  16. Statistics & Data Science

    Statistics & Data Science Dietrich College of Humanities and Social Sciences Carnegie Mellon University 5000 Forbes Avenue Pittsburgh, PA 15213 (412) 268-2717 Contact Us Legal Info www.cmu.edu

  17. PhD

    Carnegie Mellon's Department of Electrical and Computer Engineering offers one undergraduate degree and two graduate degrees, the Master of Science and PhD. Included as part of these degree programs is the ability to complete studies at various campuses throughout the world.

  18. Statistics and Neural Computation, Ph.D.

About. This Statistics and Neural Computation program at Carnegie Mellon University allows students to pursue a Ph.D. that combines Ph.D.-level training in Statistics with a solid understanding of the elements of neuroscience via the Ph.D. Program in Neural Computation that is part of the Center for the Neural Basis of Cognition.

  19. Graduate admissions

    Graduate admissions. The College of Engineering provides a diverse array of exceptional master's and Ph.D. programs through its nine departments and institutes. On this page, you will find links to learn about admission requirements, the application process, financial aid options, how to request information, and other important details.

  20. Program Plan

    Get insights into CMU's Societal Computing PhD program structure, coursework, research projects, and career progression for aspiring academic researchers and industry leaders. ... Societal Computing Software and Societal Systems Department Carnegie Mellon University 5000 Forbes Avenue Pittsburgh, PA 15213 412-268-8383. Legal Info; www.cmu.edu ...

  21. PhD Dissertations

    The Machine Learning Department at Carnegie Mellon University is ranked as #1 in the world for AI and Machine Learning, we offer Undergraduate, Masters and PhD programs. ... Nonparametric Learning in High Dimensions Han Liu, 2010 (Joint Statistics & ML PhD) Structural Analysis of Large Networks: Observations and Applications Mary McGlohon, 2010.

  22. Associate Data Scientist job with Carnegie Mellon University

    Associate Data Scientist. What We Do: Data Scientists at the SEI use advanced statistics, data analytics, machine learning, and artificial intelligence to help our government and industry clients research and solve cyber security challenges. In this role, you will work with our customers to identify areas where advanced statistical techniques ...

  23. Office of Graduate and Postdoctoral Affairs

    Carnegie Mellon is a research university with a proud heritage of outstanding graduate and undergraduate education. Our programs are ranked among the top in the country. All of our seven colleges and schools offer Master's and Doctoral degrees and several offer online programs in addition to programs at locations around the world. A cornerstone ...

  24. Statistics and Machine Learning, Ph.D.

    Students at Carnegie Mellon University in this track will be involved in courses and research from both the Departments of Statistics and Machine Learning. ... Update: For the application cycle starting Fall 2022, the GRE test is not required for application to the Statistics PhD program. You may still report the score if you wish. GRE scores ...

  25. #4 CWRU Defeats Carnegie Mellon 6-3 to Wrap-Up Regular Season

    CWRU also raised its record to 11-2 against nationally ranked Division III opponents. Carnegie Mellon dropped to 6-9 with the loss. CWRU senior Vishwa Aduru and graduate student Diego Maza helped the Spartans take the early lead with an 8-1 win at first doubles against Raghav Jangbahadur and Daniel Kong. The remaining two doubles matches were ...

  26. PDF Common Data Set 2021 2022

Graduate Degree-seeking, first-time 2,051 1,404 150 94 All other degree-seeking 2,671 1,466 352 205 All other graduates enrolled in credit courses 20 19 16 5 Total graduate 4,742 2,889 518 304 TOTAL ALL STUDENTS 8,283 6,512 634 388 2. Enrollment by Racial/Ethnic Category Degree-seeking First-time First-year,

  27. Statistics and Public Policy, Ph.D.

About. Our department at Carnegie Mellon University offers a joint program in collaboration with the Heinz College of Information Systems and Public Policy, leading to a Ph.D. in Statistics and Public Policy. Carnegie Mellon University. Pittsburgh, Pennsylvania, United States. Top 0.5% worldwide.

  28. CMU's Cutting-Edge Curriculum

    After you enroll in the 3D Bioprinting and Biofabrication graduate certificate, you will take three graduate-level, credit-bearing courses. Each course will appear on your Carnegie Mellon transcript with the grade earned. To earn the certificate, you must successfully complete all courses in the program. If you are only interested in one course ...