We present a new Frank-Wolfe-type algorithm for minimization problems with a nonsmooth convex objective. We provide convergence bounds and show that the scheme yields so-called coreset results for a variety of machine learning problems, including 1-median, balanced development, sparse principal component analysis, graph cuts, and the ℓ1-norm-regularized support vector machine. In other words, the algorithm produces approximate solutions to these problems in time that is independent of the size of the input data. Our framework, motivated by a growing body of work on sublinear algorithms for data analysis problems, is entirely deterministic and uses no smoothing or proximal operators. Beyond these theoretical results, we show experimentally that the algorithm is practical and, on large problem instances, can offer significant computational advantages. We provide an open-source implementation that can be adapted to other problems that share the same overall structure.
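To make the projection-free structure concrete, the sketch below shows a generic Frank-Wolfe loop over the probability simplex, with a subgradient oracle standing in for the gradient since the objective is nonsmooth. This is not the paper's algorithm (the abstract does not specify its update rule, and plain Frank-Wolfe with subgradients lacks convergence guarantees in general); the function name `frank_wolfe_simplex`, the choice of feasible set, and the step-size schedule are all illustrative assumptions.

```python
import numpy as np

def frank_wolfe_simplex(subgrad, x0, num_iters=200):
    """Hypothetical sketch: minimize a convex (possibly nonsmooth)
    function over the probability simplex, Frank-Wolfe style.

    subgrad(x) must return some subgradient of the objective at x.
    """
    x = x0.copy()
    for t in range(num_iters):
        g = subgrad(x)
        # Linear minimization oracle over the simplex: the optimum is
        # the vertex (coordinate) with the smallest subgradient entry.
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0
        gamma = 2.0 / (t + 2)            # standard open-loop step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Usage on a nonsmooth convex objective f(x) = ||A x - b||_1:
rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
sg = lambda x: A.T @ np.sign(A @ x - b)  # a subgradient of f at x
x_approx = frank_wolfe_simplex(sg, np.full(10, 0.1))
print(x_approx)
```

Each iterate is a convex combination of at most t + 1 simplex vertices, which is the sparsity property underlying coreset-style guarantees for methods of this family.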