14th EUROPT Workshop on Advances in Continuous Optimization

Warsaw (Poland), July 1-2, 2016

FE-04:

Friday, 14:00 - 15:00 - Room 146

Stream: Robust optimization and applications

Chair: Daniel Reem

  1. On solving an application-based completely positive program of size 3 with further results

    Chee Khian Sim, Qi Fu, Chung Piaw Teo


    We define completely positive matrices and introduce completely positive programs. We then consider a distributionally robust supply chain application and relate it to a completely positive program of size 3. We solve this program completely and, using the results obtained, provide further results on the supply chain problem.
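
    For orientation, a minimal sketch of the generic objects referred to above (the notation is ours; the specific size-3 program solved in the talk is not reproduced here): a symmetric matrix $A$ is completely positive if $A = BB^{\mathsf T}$ for some entrywise nonnegative matrix $B$, and a completely positive program minimizes a linear functional over the cone $\mathcal{CP}_n$ of such $n \times n$ matrices subject to linear constraints:

      \min_{X} \ \langle C, X \rangle
      \quad \text{s.t.} \quad \langle A_i, X \rangle = b_i, \ i = 1, \dots, m,
      \qquad X \in \mathcal{CP}_n = \{ BB^{\mathsf T} : B \ge 0 \ \text{entrywise} \}.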

  2. Interval global optimization using a template-based package for automatic differentiation and hull consistency enforcing

    Bartlomiej Kubica


    Interval global optimization methods often require computing derivatives of the objective function (and of the constraints). This can be done using automatic differentiation techniques, but the existing software (C-XSC) has many limitations and drawbacks. The author presents a novel C++ template library with the following features: generation of derivative code at compile time; the possibility of using both dense and sparse formats for Hessian matrices; the potential to compute higher-order derivatives; and integration with procedures enforcing hull consistency.
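
    As an illustration of the compile-time, template-based idea (this is not the author's library; a minimal sketch using plain forward-mode dual numbers over double, without the interval types, sparse Hessians, or hull-consistency hooks described above):

      #include <cmath>
      #include <iostream>

      // Forward-mode AD: a value and its derivative travel together, and the
      // chain rule is applied by overloaded operators. Everything is resolved
      // at template-instantiation time; no run-time expression tree is built.
      template <typename T>
      struct Dual {
          T val;  // function value
          T der;  // derivative with respect to the chosen variable
      };

      template <typename T>
      Dual<T> operator+(Dual<T> a, Dual<T> b) {
          return {a.val + b.val, a.der + b.der};
      }

      template <typename T>
      Dual<T> operator*(Dual<T> a, Dual<T> b) {
          return {a.val * b.val, a.der * b.val + a.val * b.der};  // product rule
      }

      template <typename T>
      Dual<T> sin(Dual<T> a) {
          return {std::sin(a.val), std::cos(a.val) * a.der};
      }

      int main() {
          Dual<double> x{2.0, 1.0};          // seed: dx/dx = 1
          Dual<double> f = x * sin(x) + x;   // f(x) = x sin(x) + x
          std::cout << f.val << " " << f.der << "\n";  // prints f(2) and f'(2)
      }

    In an interval global optimization setting, T would be an interval type and second-order terms would be carried for the Hessian; that machinery is omitted here.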

  3. Zero-convexity, perturbation resilience, and subgradient projections for feasibility-seeking methods

    Daniel Reem, Yair Censor


    The convex feasibility problem (CFP) has applications in many areas of science. Working in a Hilbert space, we show that the sequential subgradient projection method converges weakly or strongly to a solution of the CFP despite certain perturbations. Unlike in previous works, the functions which induce the feasibility problem's subsets need not be convex; instead, they satisfy a weaker condition called zero-convexity. Zero-convex functions hold promise for solving various optimization problems, including (approximate) minimization.

    Ref: Math. Prog. (Ser. A) 152 (2015), 339-380, arXiv:1405.1501
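
    For orientation, the classical sequential subgradient projection step for convex $f_j$ and sets $C_j = \{x \in H : f_j(x) \le 0\}$, of which the talk studies a zero-convex, perturbation-resilient generalization (the notation is ours; see the cited paper for the exact scheme):

      x_{k+1} =
        \begin{cases}
          x_k - \lambda_k \dfrac{f_{j(k)}(x_k)}{\|g_k\|^2}\, g_k, & f_{j(k)}(x_k) > 0,\ g_k \in \partial f_{j(k)}(x_k), \\
          x_k, & f_{j(k)}(x_k) \le 0,
        \end{cases}
      \qquad \lambda_k \in (0, 2).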