Semidefinite programming is a powerful framework from convex optimization with striking potential for data science applications. Even so, practitioners often criticize this approach, asserting that it is impossible to solve semidefinite programs at the scale demanded by real-world applications. In general, storage and arithmetic costs prevent us from solving many large-scale optimization problems. As a result, heuristics that rest on unverifiable assumptions are increasingly overtaking more rigorous, conventional optimization techniques, at the expense of robustness. My recent research shows that we can reverse this trend by exploiting randomization, dimensionality reduction, and adaptivity at the core of optimization. In this talk, I argue that classical convex optimization has not yet reached its limits of scalability, and I present a new optimization algorithm that solves very large semidefinite programming instances to moderate accuracy using limited arithmetic and minimal storage.
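As a minimal sketch of the dimensionality-reduction idea behind such storage savings (not the algorithm presented in the talk), the following illustrates a randomized Nyström sketch: instead of storing a large positive-semidefinite matrix explicitly, one keeps only its product with a small random test matrix and reconstructs a low-rank approximation from that sketch. The dimensions `n`, `r`, `k` and the use of NumPy here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, r, k = 200, 5, 10  # ambient dimension, true rank, sketch size (k > r)
G = rng.standard_normal((n, r))
A = G @ G.T  # a rank-r PSD matrix we pretend is too large to store

# Sketch: keep only the random test matrix Omega (n x k)
# and the sketch Y = A @ Omega (n x k) -- O(nk) storage, not O(n^2).
Omega = rng.standard_normal((n, k))
Y = A @ Omega

# Nyström reconstruction from the sketch alone:
# A_hat = Y (Omega^T A Omega)^+ Y^T, a rank-<=k PSD approximation.
core = np.linalg.pinv(Omega.T @ Y)
A_hat = Y @ core @ Y.T

err = np.linalg.norm(A - A_hat) / np.linalg.norm(A)
print(f"relative error: {err:.2e}")
```

Because the sketch size exceeds the true rank here, the reconstruction is essentially exact; for matrices with decaying spectrum the same sketch yields a near-best low-rank approximation, which is what makes the storage reduction compatible with moderate-accuracy solutions.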