Most existing descent methods suffer from the computational burden of finding a descent direction of the objective function at the current iterate, partly because they must solve a sequence of constrained quadratic programs to obtain a search direction. In this thesis, we suggest a new direction-finding subproblem: minimizing a convex function that is the sum of a norm function and the original objective function. We present an algorithm based on this subproblem and establish its convergence properties. In particular, the algorithm becomes implementable when a suitable norm function is introduced; the implementable algorithm then solves a sequence of linear programs, instead of a sequence of constrained quadratic programs, to obtain a descent direction. Limited computational experience with the implementable algorithm is also reported. In view of this computational experience, we expect our algorithm to compete successfully with other descent algorithms for minimizing nonsmooth convex functions.
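To illustrate the flavor of LP-based direction finding for a nonsmooth convex function, the following is a minimal sketch. The specific box-constrained formulation, the function `descent_direction`, and the example subgradients are illustrative assumptions, not the thesis's exact subproblem; it uses `scipy.optimize.linprog` as a generic LP solver.

```python
# Hedged sketch: finding a descent direction for a nonsmooth convex
# function by solving a single linear program, rather than a
# constrained quadratic program. The formulation below (an
# l-infinity-box-normalized subproblem) is an assumption chosen for
# illustration, not the subproblem proposed in the thesis.
import numpy as np
from scipy.optimize import linprog

def descent_direction(subgrads):
    """Given subgradients g_1, ..., g_k of a convex f at x, solve

        minimize    t
        subject to  g_i . d <= t   (i = 1, ..., k),
                    -1 <= d_j <= 1 (j = 1, ..., n).

    The optimal value t* estimates the best directional derivative over
    the unit box: t* >= 0 signals (approximate) stationarity, while
    t* < 0 yields a descent direction d.
    """
    G = np.atleast_2d(np.asarray(subgrads, dtype=float))
    k, n = G.shape
    # Decision variables: (d_1, ..., d_n, t); objective is just t.
    c = np.zeros(n + 1)
    c[-1] = 1.0
    # Constraints G d - t <= 0, written as A_ub @ (d, t) <= 0.
    A_ub = np.hstack([G, -np.ones((k, 1))])
    b_ub = np.zeros(k)
    bounds = [(-1.0, 1.0)] * n + [(None, None)]  # d boxed, t free
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:n], res.x[-1]

# Example: f(x) = max(x1 + x2, -x1, -x2). At x = (1, 1) the only
# active piece has gradient (1, 1), so the LP returns the steepest
# box-normalized direction d = (-1, -1) with t = -2.
d, t = descent_direction([[1.0, 1.0]])

# At a kink where the subdifferential contains (1, 0) and (-1, 0)
# (e.g. f(x) = |x1| at x1 = 0), the optimal value is t = 0: no
# descent direction exists in that coordinate.
d0, t0 = descent_direction([[1.0, 0.0], [-1.0, 0.0]])
```

One appeal of this construction, echoed in the abstract, is that only a linear program has to be solved per iteration; a bundle-type quadratic subproblem would instead add a quadratic stabilizing term to the objective.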