This research summary synthesizes findings from recent papers at the intersection of formal methods, stochastic systems, and machine learning. The reviewed literature reveals a strong trend towards leveraging deep generative models (e.g., diffusion models, VAEs, GANs) for tasks such as model abstraction, forecasting, and planning. A critical parallel trend is the integration of these data-driven methods with rigorous statistical guarantees, primarily through Conformal Prediction (CP) and Bayesian inference, to ensure reliability and safety. The papers are organized into four macro-topics:

1) Conformal Prediction for Runtime Verification and Monitoring, which focuses on providing efficient, statistically guaranteed predictions for temporal logic properties;
2) Scalable Bayesian Verification and Uncertainty Quantification, which addresses the challenge of scaling parametric verification to high dimensions;
3) Neuro-Symbolic and Generative Model Discovery, which explores methods to automatically learn interpretable, mechanistic models from data; and
4) Generative Abstractions and Guided Sampling, which uses generative models to create fast simulators and to steer outputs towards satisfying complex constraints.

The collective findings indicate a paradigm shift from purely simulation-based analysis to a hybrid, data-driven approach that preserves formal guarantees, enabling the application of formal verification to larger, more complex systems.
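To make the kind of guarantee CP contributes concrete, the sketch below implements split conformal prediction for a toy regression model: calibration residuals yield an interval that covers a new observation with probability at least 1 - alpha, regardless of the underlying model. The model, data, and names here are illustrative assumptions, not drawn from any of the reviewed papers.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(x):
    # Stand-in point predictor; any black-box model works for split CP.
    return 2.0 * x

# Held-out calibration data from the (assumed) data-generating process.
x_cal = rng.uniform(0.0, 1.0, 500)
y_cal = 2.0 * x_cal + rng.normal(0.0, 0.1, 500)

# Nonconformity scores: absolute residuals on the calibration set.
scores = np.abs(y_cal - predict(x_cal))

alpha = 0.1                # target miscoverage level
n = len(scores)
# Conformal quantile with the finite-sample (n + 1) correction.
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n)

# Prediction interval for a new input: covers y_new with prob >= 1 - alpha.
x_new = 0.5
interval = (predict(x_new) - q, predict(x_new) + q)
```

The guarantee is distribution-free and relies only on exchangeability of the calibration and test points, which is what makes CP attractive for wrapping learned predictors in runtime monitors.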