Publications and preprints in high-dimensional statistics, random matrix theory, machine learning, deep learning, manifold learning, time series analysis, and fast scalable computation for massive datasets, with applications to single-cell RNA sequencing, EEG/ECG/EMG and respiratory signal analysis, neuroimaging, and brain-computer interfaces.
Research Papers
# denotes students under my supervision.
Y. Wu, X. Ding, J. Zhang, W. Lan, and C.-L. Tsai. A spectral inference method for determining the number of communities in networks, arXiv: 2603.04315.
X. Ding, C. Shen, and H.-T. Wu. Generalized Robust Adaptive-Bandwidth Multi-View Manifold Learning in High Dimensions with Noise, arXiv: 2602.10530.
X. Ding and Y. Hu#. On the edge eigenvalues of sparse random geometric graphs, arXiv: 2509.07372.
C. Qian#, X. Ding, and L. Li. Structural Classification of Locally Stationary Time Series Based on Second-order Characteristics, arXiv: 2507.04237. Code
X. Ding and Z. Zhou. Simultaneous Sieve Estimation and Inference for Time-Varying Nonlinear Time Series Regression, arXiv: 2506.23069. Package
X. Ding, J. Xie#, L. Yu, and W. Zhou. Multiplier bootstrap meets high-dimensional PCA: the good, the bad and the modification.
C. Abi Younes, X. Ding, and T. Trogdon. (alphabetical order). A Lanczos-Based Algorithmic Approach for Spike Detection in Large Sample Covariance Matrices, arXiv: 2504.03066. Code
X. Ding, Y. Li, and F. Yang. Eigenvector distributions and optimal shrinkage estimators for large covariance and precision matrices, arXiv: 2404.14751. Package
X. Ding, Y. Hu#, and Z. Wang#. Two sample test for covariance matrices in ultra-high dimension, Journal of the American Statistical Association, 120(552):2210–222, 2025. Package
X. Ding and R. Ma. Kernel spectral joint embeddings for high-dimensional noisy datasets using duo-landmark integral operators, Journal of the American Statistical Association, 120(551):1463–1476, 2025. Code
X. Ding and Z. Wang#. Global and local CLTs for linear spectral statistics of general sample covariance matrices when the dimension is much larger than the sample size with applications, IEEE Transactions on Information Theory, 71(8):6248–6296, 2025. Code
X. Ding and J. Xie#. Tracy-Widom distribution for the edge eigenvalues of elliptical model, Information and Inference: A Journal of the IMA, 14(2):iaaf004, 2025.
X. Ding and Z. Zhou. On the partial autocorrelation function for locally stationary time series: characterization, estimation and inference, Biometrika, 112(2):asaf016, 2025. Package
X. Ding and H.-T. Wu. How do kernel-based sensor fusion algorithms behave under high dimensional noise?, Information and Inference: A Journal of the IMA, 13(1):iaad051, 2024.
X. Ding and T. Trogdon. A Riemann–Hilbert approach to the perturbation theory for orthogonal polynomials: Applications to numerical linear algebra and random matrix theory, International Mathematics Research Notices, 2024(5):3975–4061, 2024.
X. Ding and R. Ma. Learning Low-Dimensional Nonlinear Structures from High-Dimensional Noisy Data: An Integral Operator Approach, The Annals of Statistics, 51(4):1744–1769, 2023. Code
X. Ding and H. Ji. Local laws for multiplication of random matrices, The Annals of Applied Probability, 33(4):2981–3009, 2023.
X. Ding and H.-T. Wu. Impact of signal-to-noise ratio and bandwidth on graph Laplacian spectrum from high-dimensional noisy point cloud, IEEE Transactions on Information Theory, 69(3):1899–1931, 2023.
X. Ding and Z. Zhou. Auto-regressive approximations to non-stationary time series, with inference and applications, The Annals of Statistics, 51(3):1207–1231, 2023. Package
X. Ding and H. Ji. Spiked multiplicative random matrices and principal components, Stochastic Processes and their Applications, 163:25–60, 2023.
X. Ding and F. Yang. Tracy-Widom distribution for heterogeneous Gram matrices with applications in signal detection, IEEE Transactions on Information Theory, 68(10):6682–6715, 2022. Package
X. Ding and F. Yang. Edge statistics of large dimensional deformed rectangular matrices, Journal of Multivariate Analysis, 192:105051, 2022.
Z. Bao, X. Ding, J. Wang, and K. Wang. (alphabetical order). Statistical inference for principal components of spiked covariance matrices, The Annals of Statistics, 50(2):1144–1169, 2022.
X. Ding and T. Trogdon. The conjugate gradient algorithm on a general class of spiked covariance matrices, Quarterly of Applied Mathematics, 80(1):99–155, 2022.
X. Ding and F. Yang. Spiked separable covariance matrices and principal components, The Annals of Statistics, 49(2):1113–1138, 2021.
X. Ding, D. Yu, Z. Zhang, and D. Kong. Multivariate functional responses low rank regression with an application to brain imaging data, The Canadian Journal of Statistics, 49(1):150–181, 2021.
Z. Bao, X. Ding, and K. Wang. (alphabetical order). Singular vector and singular subspace distribution for the matrix denoising model, The Annals of Statistics, 49(1):370–392, 2021.
X. Ding. Spiked sample covariance matrices with possibly multiple bulk components, Random Matrices: Theory and Applications, 10(1):2150014, 2021. Package
X. Ding and H.-T. Wu. On the spectral property of kernel-based sensor fusion algorithms of high dimensional data, IEEE Transactions on Information Theory, 67(1):640–670, 2021.
X. Ding and Z. Zhou. Estimation and inference for precision matrices of nonstationary time series, The Annals of Statistics, 48(4):2455–2477, 2020.
X. Ding. High dimensional deformed rectangular matrices with applications in matrix denoising, Bernoulli, 26(1):387–417, 2020. Package
X. Ding. Singular vector distribution of sample covariance matrices, Advances in Applied Probability, 51(1):236–267, 2019.
X. Ding and F. Yang. A necessary and sufficient condition for edge universality at the largest singular values of covariance matrices, The Annals of Applied Probability, 28(3):1679–1738, 2018.