In this talk, we present a graph-based method for revealing the low-dimensional manifold underlying time series and for inferring their latent driving processes. The approach provides intrinsic modeling based on empirical information geometry: unlike traditional information-geometric analysis, we compute a Riemannian metric between estimates of the local probability densities of the measurements. A parameterization of the manifold is then obtained empirically from the eigenvectors of an appropriate Laplace operator. The learned model has two important properties: it is invariant under different observation and instrumental modalities, and it is resilient to noise. In addition, it can be extended efficiently to newly acquired measurements in a sequential manner. Equipped with such a model, we adopt the state-space formalism and present a framework for sequential processing, which we apply to nonlinear, non-Gaussian filtering problems. We also show applications to acoustic signal processing and to biomedical signal and image processing.
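
The Laplace-operator parameterization mentioned above can be illustrated with a minimal diffusion-maps-style sketch. This is not the authors' implementation: the Gaussian kernel, the bandwidth `eps`, the function name `laplacian_embedding`, and the toy data are all illustrative assumptions, standing in for the information-geometric metric and density estimates described in the talk.

```python
import numpy as np

def laplacian_embedding(X, eps=1.0, n_coords=2):
    """Embed samples X (n x d) via eigenvectors of a graph Laplacian.

    A diffusion-maps-style sketch (assumed, not the talk's exact method):
    build a Gaussian affinity matrix, row-normalize it into a Markov
    transition matrix, and use its leading non-trivial eigenvectors as
    empirical manifold coordinates.
    """
    # Pairwise squared Euclidean distances between samples
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / eps)                 # Gaussian kernel affinities
    P = W / W.sum(axis=1, keepdims=True)  # row-stochastic transition matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)        # sort by decreasing eigenvalue
    # Skip the trivial constant eigenvector (eigenvalue 1)
    return vecs[:, order[1:1 + n_coords]].real

# Noisy observations of a 1-D closed curve embedded in 3-D
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
X = np.c_[np.cos(t), np.sin(t), 0.1 * rng.standard_normal(t.size)]
coords = laplacian_embedding(X, eps=0.5)
print(coords.shape)  # (200, 2)
```

In this sketch the Euclidean kernel plays the role of the Riemannian metric between local density estimates; extending the embedding to new measurements sequentially, as the talk describes, would correspond to out-of-sample extension schemes such as the Nyström method.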