As an indispensable approach to one-class classification, support vector data description (SVDD) has been studied across diverse research areas and application domains. Distant SVDD (dSVDD) is a variant of SVDD that achieves higher identification accuracy. However, dSVDD suffers from high computational cost and troublesome parameterization, which limit its popularity. This paper proposes a fast distant SVDD (fdSVDD) algorithm that addresses these two problems while maintaining performance. To this end, a new objective that is equivalent to dSVDD's original objective is first derived; the least-squares version of this new objective then serves as the objective of fdSVDD; finally, fdSVDD is implemented by solving a set of linear equations. To handle the parameterization problem, a data-derived heuristic is given. To further improve efficiency, fdSVDD is equipped with a training-data reduction strategy and a support-vector specification strategy. In the presence of negative data, fdSVDD is extended to fast parallel SVDD (fpSVDD). In experiments on real datasets, the proposed algorithms exhibit clear improvements in efficiency and competitive performance compared with peer methods.
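
To illustrate the core idea of replacing a quadratic program with a linear solve, the sketch below implements a generic least-squares SVDD, in which the inequality constraints of standard SVDD are replaced by equalities with squared slacks so that the KKT conditions reduce to one linear system. This is an assumption-laden stand-in, not the paper's method: the exact fdSVDD objective (and its equivalence to dSVDD), the data-derived parameter heuristic, and the reduction and specification strategies are not reproduced here, and the RBF kernel, the parameter C, and the threshold rule are illustrative choices.

```python
# Minimal sketch of a least-squares SVDD trained by a single linear solve.
# NOTE: this is the generic LS-SVDD derivation, not the fdSVDD objective
# from the paper; kernel, C, and threshold are illustrative assumptions.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def fit_ls_svdd(X, C=1.0, gamma=1.0):
    """Solve the KKT system of a least-squares SVDD.

    Primal: min R^2 + C * sum(e_i^2)  s.t.  ||phi(x_i) - a||^2 = R^2 + e_i.
    Eliminating a, R, and e (with rho := R^2 - alpha^T K alpha) yields
        [2K + I/(2C)  1] [alpha]   [diag(K)]
        [    1^T      0] [ rho ] = [   1   ]
    with center a = sum_i alpha_i phi(x_i) and R^2 = rho + alpha^T K alpha.
    """
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = 2 * K + np.eye(n) / (2 * C)
    A[:n, n] = 1.0
    A[n, :n] = 1.0
    b = np.concatenate([np.diag(K), [1.0]])
    sol = np.linalg.solve(A, b)           # one linear solve, no QP iterations
    alpha, rho = sol[:n], sol[n]
    R2 = rho + alpha @ K @ alpha          # squared radius of the ball
    return alpha, R2

def decision(X_train, alpha, R2, X_test, gamma=1.0):
    """Values <= 0 mean inside the ball, i.e. accepted as the target class."""
    Kxx = np.ones(len(X_test))            # k(x, x) = 1 for the RBF kernel
    Kxt = rbf_kernel(X_test, X_train, gamma)
    Ktt = rbf_kernel(X_train, X_train, gamma)
    dist2 = Kxx - 2 * Kxt @ alpha + alpha @ Ktt @ alpha
    return dist2 - R2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))                  # target-class samples
    outliers = rng.normal(loc=4.0, size=(10, 2))   # far-away points
    alpha, R2 = fit_ls_svdd(X, C=1.0, gamma=0.5)
    print((decision(X, alpha, R2, X, gamma=0.5) <= 0).mean())         # mostly inside
    print((decision(X, alpha, R2, outliers, gamma=0.5) <= 0).mean())  # mostly outside
```

Because the bordered matrix above is nonsingular whenever the kernel matrix is positive semidefinite and C > 0, training reduces to one O(n^3) factorization instead of an iterative QP, which is the efficiency lever the abstract attributes to fdSVDD's least-squares formulation.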