Many real-world datasets live on high-dimensional Stiefel and Grassmannian manifolds, $V_k(\mathbb{R}^N)$ and $Gr(k, \mathbb{R}^N)$ respectively, and benefit from projection onto lower-dimensional Stiefel and Grassmannian manifolds. In this work, we propose an algorithm called \textit{Principal Stiefel Coordinates} (PSC) to reduce data dimensionality from $V_k(\mathbb{R}^N)$ to $V_k(\mathbb{R}^n)$ in an \textit{$O(k)$-equivariant} manner ($k \leq n \ll N$). We begin by observing that each element $\alpha \in V_n(\mathbb{R}^N)$ defines an isometric embedding of $V_k(\mathbb{R}^n)$ into $V_k(\mathbb{R}^N)$. Next, we describe two ways of finding a suitable embedding map $\alpha$: one via an extension of principal component analysis ($\alpha_{\mathrm{PCA}}$), and one that further minimizes data fit error using gradient descent ($\alpha_{\mathrm{GD}}$). Then, we define a continuous and $O(k)$-equivariant map $\pi_\alpha$ that acts as a ``closest point operator'' to project the data onto the image of $V_k(\mathbb{R}^n)$ in $V_k(\mathbb{R}^N)$ under the embedding determined by $\alpha$, while minimizing distortion. Because this dimensionality reduction is $O(k)$-equivariant, these results extend to Grassmannian manifolds as well. Lastly, we show that $\pi_{\alpha_{\mathrm{PCA}}}$ globally minimizes projection error in a noiseless setting, while $\pi_{\alpha_{\mathrm{GD}}}$ achieves a meaningfully different and improved outcome when the data does not lie exactly on the image of a linearly embedded lower-dimensional Stiefel manifold as above. We validate these methods through multiple numerical experiments on synthetic and real-world data.
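To make the two ingredients above concrete, the following is a minimal sketch of one plausible realization, not the paper's definitive implementation: it assumes $\alpha_{\mathrm{PCA}}$ is taken as the top-$n$ left singular vectors of the horizontally concatenated data frames, and that $\pi_\alpha$ projects a frame $Y \in V_k(\mathbb{R}^N)$ by taking the polar (orthogonal Procrustes) factor of $\alpha^\top Y$ and mapping it back through $\alpha$. The function names `alpha_pca` and `pi_alpha` are hypothetical.

```python
import numpy as np

def alpha_pca(data, n):
    """Sketch of a PCA-style choice of alpha in V_n(R^N): take the top-n
    left singular vectors of the concatenated data frames. `data` is a
    list of N x k matrices with orthonormal columns. (Assumed construction,
    based on the abstract's description of an extension of PCA.)"""
    stacked = np.hstack(data)                          # N x (k * num_points)
    U, _, _ = np.linalg.svd(stacked, full_matrices=False)
    return U[:, :n]                                    # N x n, orthonormal columns

def pi_alpha(alpha, Y):
    """Sketch of the closest-point operator: project Y in V_k(R^N) onto the
    image of V_k(R^n) under X -> alpha @ X by taking the polar factor of
    alpha^T Y (the orthogonal Procrustes solution). Right multiplication by
    Q in O(k) commutes with this map, so it is O(k)-equivariant:
    pi_alpha(Y @ Q) == pi_alpha(Y) @ Q."""
    U, _, Vt = np.linalg.svd(alpha.T @ Y, full_matrices=False)
    return alpha @ (U @ Vt)                            # N x k, orthonormal columns

# Tiny usage example with synthetic frames (N=10, k=2, n=4).
rng = np.random.default_rng(0)
N, k, n = 10, 2, 4
data = [np.linalg.qr(rng.standard_normal((N, k)))[0] for _ in range(50)]
alpha = alpha_pca(data, n)
Y_proj = pi_alpha(alpha, data[0])
print(np.allclose(Y_proj.T @ Y_proj, np.eye(k)))       # projected frame is orthonormal
```

Under these assumptions, the equivariance claimed in the abstract falls out of the SVD step: for $Q \in O(k)$, if $\alpha^\top Y = U\Sigma V^\top$, then $\alpha^\top (YQ) = U\Sigma (V^\top Q)$, whose polar factor is $(UV^\top)Q$.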