14th EUROPT Workshop on Advances in Continuous Optimization

Warsaw (Poland), July 1-2, 2016

SC-3:

Saturday, 11:10 - 12:50 - Room 108


Stream: Convex and non-smooth optimization

Chair: Oliver Stein

  1. Lipschitz continuous positively homogeneous functions

    Marina Trafimovich


    In this talk we endow the space of Lipschitz continuous positively homogeneous functions with the structure of a Banach space, taking the Lipschitz modulus as the norm. We prove that the Lipschitz modulus norm is stronger than the uniform norm on the space of Lipschitz continuous positively homogeneous functions and weaker than the Bartels-Pallaschke norm on the space of difference-sublinear functions [1].

    [1] Valentin V. Gorokhovik, Marina Trafimovich: Positively Homogeneous Functions Revisited (2016). DOI 10.1007/s10957-016-0891-4.
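    As a sketch of why the Lipschitz modulus norm dominates the uniform norm (our notation and argument, not necessarily the authors'): a positively homogeneous function satisfies f(0) = 0, so taking y = 0 in the Lipschitz modulus already bounds the supremum on the unit sphere,

    ```latex
    \|f\|_{\mathrm{Lip}}
      = \sup_{x \neq y} \frac{|f(x) - f(y)|}{\|x - y\|}
      \;\ge\; \sup_{\|x\| = 1} \frac{|f(x) - f(0)|}{\|x\|}
      = \sup_{\|x\| = 1} |f(x)|
      = \|f\|_{\infty},
    ```

    so convergence in the Lipschitz modulus norm implies uniform convergence on the unit sphere.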

  2. Semi-quasidifferentiable multiobjective optimization

    Majid Soleimani-damaneh, Alireza Kabgani


    In this paper, we introduce the concept of semi-quasidifferentiable functions, motivated by the convexificator notion in nonsmooth analysis. Some properties of semi-quasidifferentials are established. For a multiobjective optimization problem, a characterization of the normal cone of the feasible set is provided and used to derive necessary optimality conditions. We close the paper by obtaining optimality conditions for multiobjective optimization in terms of semi-quasidifferentials. Some of our results generalize existing results in the literature.

  3. Solving Some of the Largest Problems of the Literature by the Accelerated Hyperbolic Smoothing Clustering Method

    Adilson Elias Xavier, Vinicius Layter Xavier


    This work considers the solution of the minimum sum-of-squares clustering problem by the Hyperbolic Smoothing method, combined with a partition of the observations into two non-overlapping groups: data in the frontier and data in gravitational regions. This partition drastically simplifies the computational tasks. To illustrate both the reliability and the efficiency of the method, we performed a set of computational experiments on traditional very large test problems from the literature, producing unprecedented results in the context of clustering analysis.
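    The core hyperbolic smoothing primitive can be sketched as follows. This is a minimal illustration under our own parameter choices, not the authors' accelerated method or their frontier/gravitational partition scheme; the names `phi`, `smoothed_min`, and `smoothed_mssc` are ours.

    ```python
    import numpy as np

    def phi(y, tau):
        # Hyperbolic smoothing of the plus function max(y, 0):
        # phi is infinitely differentiable for tau > 0 and
        # phi(y, tau) -> max(y, 0) as tau -> 0.
        return (y + np.sqrt(y * y + tau * tau)) / 2.0

    def smoothed_min(d, tau, eps):
        # Smooth surrogate for min_i d_i: the unique root z of
        #   sum_i phi(z - d_i, tau) = eps,
        # found here by bisection; it tends to min_i d_i as tau, eps -> 0.
        lo, hi = d.min() - 10.0, d.max() + 1.0
        for _ in range(100):
            mid = (lo + hi) / 2.0
            if phi(mid - d, tau).sum() < eps:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2.0

    def smoothed_mssc(centers, points, tau=1e-2, eps=1e-2):
        # Smoothed minimum sum-of-squares clustering objective:
        # the squared distances are already smooth in the centers,
        # so only the inner min over centers needs smoothing.
        total = 0.0
        for s in points:
            d = np.sum((centers - s) ** 2, axis=1)
            total += smoothed_min(d, tau, eps)
        return total
    ```

    Minimizing `smoothed_mssc` with a gradient-based solver while driving `tau` and `eps` toward zero recovers the original nonsmooth clustering problem; the acceleration discussed in the talk additionally exploits the frontier/gravitational split of the observations.
    
    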

  4. Linear generalized Nash equilibrium problems and nonsmooth optimization

    Nathan Sudermann-Merx, Oliver Stein, Axel Dreves


    We introduce linear generalized Nash equilibrium problems (LGNEPs) and study a reformulation of the LGNEP as a nonsmooth, nonconvex optimization problem. The nondifferentiability points of the objective function can be characterized in terms of a new regularity condition, the so-called cone condition. Furthermore, we present promising numerical results based on a simple projected subgradient method which, especially for large problems, outperforms other numerical approaches from the literature.
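    For context, a generic projected subgradient iteration on a piecewise linear convex model problem can be sketched as below. This is a toy instance with our own names and a box feasible set, not the paper's LGNEP objective or its specialized method.

    ```python
    import numpy as np

    def projected_subgradient(A, b, lo, hi, x0, iters=2000, alpha0=1.0):
        # Minimize f(x) = max_k (A[k] @ x + b[k]) over the box [lo, hi]
        # using diminishing steps and projection onto the box.
        x = x0.astype(float).copy()
        best_x, best_f = x.copy(), np.max(A @ x + b)
        for t in range(iters):
            k = np.argmax(A @ x + b)          # active piece gives a subgradient A[k]
            x = x - (alpha0 / np.sqrt(t + 1)) * A[k]
            x = np.clip(x, lo, hi)            # projection onto the box
            f = np.max(A @ x + b)
            if f < best_f:                    # subgradient steps need not descend,
                best_f, best_x = f, x.copy()  # so track the best iterate
        return best_x, best_f
    ```

    For example, with A = [[1], [-1]] and b = [0, 0] the objective is f(x) = |x|, and the iteration converges to a neighborhood of the minimizer x = 0 whose size shrinks with the step length.
    
    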

  5. Solving Disjunctive Optimization Problems by Generalized Semi-infinite Optimization Techniques

    Oliver Stein, Peter Kirst


    We describe a new way to model disjunctive optimization problems as generalized semi-infinite programs. In contrast to existing methods in disjunctive programming, our approach does not require any special formulation of the underlying logical expression. Applying existing lower-level reformulations to the corresponding semi-infinite program, we derive conjunctive nonlinear problems without any logical expressions, which can be solved locally by standard nonlinear solvers. Our preliminary numerical results indicate that our reformulation procedure is a reasonable approach.