Pesquisa Operacional, vol. 36, no. 2 (in English)

A SURVEY ON HEURISTICS FOR THE TWO-DIMENSIONAL RECTANGULAR STRIP PACKING PROBLEM

ABSTRACT Two-dimensional rectangular strip packing problems belong to the broader class of Cutting and Packing (C&P) problems, in which small items are required to be cut from or packed on a larger object, so that the waste (unused regions of the large object) is minimized. C&P problems differ from other combinatorial optimization problems by their intrinsic geometric constraints: items may not overlap and have to be fully contained in the large object. This survey addresses the specific C&P problem in which all items are rectangles, therefore fully characterized by a width and a height, and the large object is a strip, i.e. a rectangle with a fixed width but infinite height; the goal is to place all rectangles on the strip so that the height used is minimized. These problems have been intensively and extensively tackled in the literature, and this paper focuses on heuristic resolution methods. Both the seminal and the most recent approaches (from the last decade) are reviewed, in a tutorial flavor, and classified according to their type: constructive heuristics, improvement heuristics with search over sequences, and improvement heuristics with search over layouts. Building on this review, research gaps are identified and the most promising research directions are pointed out.

OPTIMALITY AND PARAMETRIC DUALITY FOR NONSMOOTH MINIMAX FRACTIONAL PROGRAMMING PROBLEMS INVOLVING L-INVEX-INFINE FUNCTIONS

ABSTRACT Karush-Kuhn-Tucker type necessary optimality conditions are given for the nonsmooth minimax fractional programming problem with inequality and equality constraints.
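For reference, and in notation assumed here rather than taken from the paper, a minimax fractional program of this kind can be written as:

```latex
\min_{x \in \mathbb{R}^n} \; \max_{1 \le i \le p} \; \frac{f_i(x)}{g_i(x)}
\quad \text{subject to} \quad
h_j(x) \le 0, \; j = 1, \dots, m, \qquad
e_k(x) = 0, \; k = 1, \dots, q,
```

where the $f_i$, $g_i$ (with $g_i(x) > 0$ on the feasible set), $h_j$ and $e_k$ are locally Lipschitz, not necessarily differentiable, functions.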
Subsequently, based on the idea of L-invex-infine functions defined in terms of the limiting/Mordukhovich subdifferential of locally Lipschitz functions, we obtain sufficient optimality conditions for the considered nonsmooth minimax fractional programming problem, and we provide an example to justify the existence of sufficient optimality conditions. Furthermore, we propose a parametric type dual problem and explore duality results.

A NEW MATHEMATICAL MODEL FOR THE WORKOVER RIG SCHEDULING PROBLEM

ABSTRACT One of the most important activities in the oil and gas industry is the intervention in wells for maintenance services, which is necessary to ensure the constant production of oil. These interventions are carried out by workover rigs. Thus, the Workover Rig Scheduling Problem (WRSP) consists of finding the best schedule to service the wells, considering the limited number of rigs, with the objective of minimizing the total production loss. In this study, a 0-1 integer linear programming model capable of efficiently solving the WRSP with a homogeneous fleet of rigs is proposed. Computational experiments were carried out using instances based on real cases in Brazil to compare the results obtained by the proposed model with the results reported by other methods. The proposed model was capable of solving all instances considered in a reduced computational time, including the large instances for which only approximate solutions were previously known.

CONSIDERATIONS REGARDING THE CHOICE OF RANKING MULTIPLE CRITERIA DECISION MAKING METHODS

ABSTRACT Various methods, known as Multiple Criteria Decision Making (MCDM) methods, have been proposed to assist decision makers in the process of ranking alternatives. Given the variety of available methods, choosing an MCDM ranking method is a difficult task.
There are key factors in the process of choosing an MCDM method, such as: (i) available time; (ii) effort required by a given approach; (iii) importance of accuracy; (iv) need for transparency; (v) need for conflict minimization; and (vi) the facilitator's skill with the method. However, the problem is further compounded by the knowledge that the solutions of MCDM ranking methods may be sensitive to slight variations in the input data and, in some cases, may swap the best alternative for the worst when the weightings of the criteria are changed. Some researchers have addressed this problem through different approaches, including the evaluation of MCDM ranking methods in terms of predicting the initial rankings given by the decision maker. The objective of this study is to propose an empirical experiment to evaluate the ability of the principal MCDM ranking methods, namely SAW, TOPSIS, ELECTRE III, PROMETHEE II and TODIM, to predict initial rankings. The study also aimed to assess common ranking problems of MCDM methods reported in the literature, such as reversibility (rank reversal). It was found that only up to 20% of the initial ranking orders were predicted entirely correctly by some of the methods. It was also found that only a few methods did not present internal ranking inconsistency. The results of this study and those found in the literature give a warning regarding the choice of MCDM ranking methods. Special care must be taken in the choice of methods and, besides axiomatic comparisons, ranking comparisons can be a useful way to enhance the decision making process, since MCDM methods are tools for learning about the problem and do not prescribe solutions.

PACKING CIRCLES WITHIN CIRCULAR CONTAINERS: A NEW HEURISTIC ALGORITHM FOR THE BALANCE CONSTRAINTS CASE

ABSTRACT In this work we propose a heuristic algorithm for the layout optimization of disks installed in a rotating circular container.
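As a rough illustration (not the authors' algorithm; all names and tolerances are assumptions), the geometric core of any balance-aware disk placement heuristic is a feasibility check plus the center of mass of the disks placed so far:

```python
import math

def fits(container_r, placed, x, y, r):
    """Check that a disk of radius r centered at (x, y) stays inside a
    container of radius container_r (centered at the origin) and does not
    overlap any already placed disk. placed: list of (px, py, pr) tuples."""
    if math.hypot(x, y) + r > container_r + 1e-9:  # containment check
        return False
    # non-overlap: center distance must be at least the sum of the radii
    return all(math.hypot(x - px, y - py) + 1e-9 >= r + pr
               for px, py, pr in placed)

def center_of_mass(placed):
    """Mass-weighted center of the placed disks, assuming uniform density
    (mass proportional to area; the factor pi cancels out)."""
    masses = [pr * pr for _, _, pr in placed]
    m = sum(masses)
    cx = sum(px * w for (px, _, _), w in zip(placed, masses)) / m
    cy = sum(py * w for (_, py, _), w in zip(placed, masses)) / m
    return cx, cy
```

A balance constraint then amounts to keeping `center_of_mass` close to the container's rotation axis as disks are added.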
This is an unequal circle packing problem with additional balance constraints. It has been proved to be an NP-hard problem, which justifies heuristic methods for its resolution in larger instances. The main feature of our heuristic is the selection of the next circle to be placed inside the container according to the position of the system's center of mass. Our approach has been tested on a series of instances with up to 55 circles and compared with the literature. Computational results show good performance of the proposed algorithm in terms of solution quality and computational time.

MONITORING THE STATOR CURRENT IN INDUCTION MACHINES FOR POSSIBLE FAULT DETECTION: A FUZZY/BAYESIAN APPROACH FOR THE PROBLEM OF TIME SERIES MULTIPLE CHANGE POINT DETECTION

ABSTRACT This paper addresses the problem of fault detection in the stator winding of induction machines through a multiple change point detection approach in time series. To handle this problem, a new fuzzy/Bayesian approach is proposed, which differs from previous approaches in that it does not require prior information such as the number of change points or the characterization of the probabilistic distribution of the data. The approach has been applied to monitoring the stator winding current of an induction machine. The good results obtained by the proposed methodology illustrate its efficiency.

A COMPARATIVE STUDY BETWEEN ARTIFICIAL NEURAL NETWORK AND SUPPORT VECTOR MACHINE FOR ACUTE CORONARY SYNDROME PROGNOSIS

ABSTRACT Despite medical advances, mortality due to acute coronary syndrome remains high. For this reason, it is important to identify the most critical factors for predicting the risk of death in patients hospitalized with this disease. To improve medical decisions, it is also helpful to construct models that enable us to represent how the main driving factors relate to patient outcomes.
In this study, we compare the capability of Artificial Neural Network (ANN) and Support Vector Machine (SVM) models to distinguish between patients hospitalized with acute coronary syndrome who have low or high risk of death. Input variables are selected using the wrapper approach associated with a mutual information filter and two newly proposed filters based on Euclidean distance. Because of missing data, the use of a filter is an important step in increasing the size of the usable data set and maximizing the performance of the classification models. The computational results indicate that the SVM model performs better. The most relevant input variables are age, any previous revascularization, and creatinine, regardless of the classification algorithms and filters used. However, the Euclidean filters also identify a second important group of input variables: age, creatinine and systemic arterial hypertension.

AN ALTERNATIVE REPARAMETRIZATION FOR THE WEIGHTED LINDLEY DISTRIBUTION

ABSTRACT Recently, [12] introduced a generalization of the one-parameter Lindley distribution and named it the weighted Lindley distribution. Considering this newly introduced weighted Lindley distribution, we propose a reparametrization of the shape parameter that makes it orthogonal to the other shape parameter. In this alternative parametrization, the transformed parameter has a direct interpretation: it is the mean survival time. For illustrative purposes, the weighted Lindley distribution in the new parametrization is applied to two real data sets. The one-parameter Lindley distribution and its generalized form are also fitted to the considered data sets.

INFERENCE OF THE MARKET VALUE OF URBAN LOTS FROM THE PERSPECTIVE OF SURVIVAL ANALYSIS. CASE STUDY: CITY OF SÃO CARLOS, SÃO PAULO, BRAZIL

ABSTRACT Survival analysis was originally proposed for the analysis of data on the time before the occurrence of a specific event of interest, and has been widely used in studies of biomedical data (survival analysis), industrial research (reliability analysis) and financial data (credit scoring). In this study, we present a new approach to modeling the market value of urban lots based on survival analysis with a left censoring mechanism, which allows estimating the probability of sale and the hazard associated with sales at the lot's market value. Survival analysis introduces greater flexibility than the usual linear models, as it allows including both effectively traded lots (not censored) and lots under negotiation (censored) in the modeling process; this flexibility alone, however, is not enough to affirm an effective improvement of models based on survival analysis.

MODELS AND METHODS FOR LOGISTICS HUB LOCATION: A REVIEW TOWARDS TRANSPORTATION NETWORKS DESIGN

ABSTRACT Logistics hubs affect the distribution patterns in transportation networks since they are flow-concentrating structures. Indeed, the efficient movement of goods throughout supply chains depends on the design of such networks. This paper presents a literature review on the logistics hub location problem, providing an outline of modeling approaches, solution techniques, and their applicability to this context. Two categories of models were identified. While multi-criteria models may seem best suited to find optimal locations, they do not allow an assessment of the impact of new hubs on goods flow and on the transportation network. On the other hand, single-criterion models, which provide location and flow allocation information, adopt network simplifications that hinder an accurate representation of the relationship between origins, destinations, and hubs.
In view of these limitations, we propose future research directions for addressing the real challenges of logistics hub location in transportation network design.
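To make the single-criterion category concrete, the following is a brute-force sketch of a classic single-criterion formulation, the single-allocation p-hub median problem (all names, the toy instance and the inter-hub discount factor are assumptions for illustration, not taken from the review):

```python
from itertools import combinations

def phub_median(nodes, dist, flows, p, alpha=0.75):
    """Brute-force single-allocation p-hub median (toy sizes only).

    nodes : list of node ids
    dist  : dict (i, j) -> distance, symmetric, dist[(i, i)] == 0
    flows : dict (origin, dest) -> flow volume
    p     : number of hubs to open
    alpha : inter-hub discount factor (economies of scale on hub-hub legs)
    Returns (best_cost, best_hubs, allocation).
    """
    best = (float("inf"), None, None)
    for hubs in combinations(nodes, p):
        # single allocation: each node is assigned to its nearest open hub
        assign = {i: min(hubs, key=lambda h: dist[(i, h)]) for i in nodes}
        # every origin-destination flow is routed origin -> hub -> hub -> dest
        cost = sum(
            f * (dist[(o, assign[o])]
                 + alpha * dist[(assign[o], assign[d])]
                 + dist[(assign[d], d)])
            for (o, d), f in flows.items()
        )
        if cost < best[0]:
            best = (cost, hubs, assign)
    return best
```

Even this toy version returns both locations and flow allocations, which is exactly the information the single-criterion category provides; the simplification lies in routing all flow through exactly one or two hubs.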