Abstract in English:
ABSTRACT This paper aims to support decision-making on the return to in-person classes, determining actions to reduce the problems arising from the pandemic through a multi-methodological approach combining Value-Focused Thinking (VFT) and Rich Picture (RP). The case study involves a small Brazilian university located in a region of social and economic vulnerability. Six stakeholders representing the students, the municipal community, and the university were interviewed, from which a hierarchy of fundamental objectives and objectives networks, spanning education, health, and community, were created. Even though health concerns were shared by all stakeholders, these networks revealed a variety of perspectives not typically discussed in large universities, such as access to information and communication technologies, awareness of biosafety measures, and real estate vacancy. These findings indicate that future research should address the specific needs of small universities in vulnerable cities, particularly in developing and underdeveloped countries, where a university's presence fosters the local economy.
ABSTRACT The choice and distribution of the optimal mix of helicopters to service offshore air operations is an important logistical problem in the oil and gas industry, with considerable cost-reduction potential. The variety of helicopter models and sizes, as well as possible operational restrictions of airport bases and airspace, requires adequate numerical techniques from Operational Research. In this context, this work develops a model to determine the trajectories and distances traveled between airport bases and offshore maritime units, and vice-versa, through an airspace modeled as a directed graph. It also develops a helicopter performance model to estimate the payload and flight times for each specified mission and, finally, an integer linear programming model to allocate an optimal fleet and airport mix for passenger transport using helicopters of different sizes. The model was applied to offshore units operating in the Santos basin, given their strategic importance in the oil and gas industry. The results include a map of preferred regions for two helicopter sizes (large and medium) and weekly flight tables showing the allocation of helicopters and airports for different demand scenarios. Additionally, the impact of fuel prices at different airport bases and the concentration of movements on air routes are evaluated to complement the analysis.
ABSTRACT This work considers the strategic Supply Chain Network Design (SCND) problem, which is to define the number and location of facilities, and the flow of products among them, to fulfill a long-term deterministic demand. A two-phase heuristic approach was specially developed to solve large-scale problems in reasonable time, extending a previous algorithm introduced in Farias et al. (2017). In the construction phase, a multi-start approach generates diversified initial solutions from each new iteration of a layer-based rounding heuristic. In the second phase, a local search heuristic improves the solution provided by the rounding method. The solution method is evaluated using randomly generated instances and in a real case study applied to a company redesigning its supply chain for two product lines. The obtained results evidence the effectiveness and flexibility of the developed approach for handling very large instances.
ABSTRACT The COS method was introduced in Fang & Oosterlee (2008) and has since been applied to pricing a variety of stock options on continuous random variables. This paper adapts the Fourier-cosine series (COS) method to recover discrete probability mass functions. We approximate mixture and compound probability distributions with cosine series. The function estimates obtained exhibit high precision and computational speed. We also develop a pricing framework for derivatives subject to discrete random variables. We apply the method to calculate, for the first time, the price of an interest rate derivative of recent vintage introduced in the Brazilian financial market. Parameter calibration confirms the quality of the model.
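The recovery of a discrete probability mass function from a characteristic function can be illustrated with a plain Fourier inversion for integer-valued random variables, the principle that the COS method accelerates with cosine expansions. The sketch below is an illustration under assumed inputs (a Poisson characteristic function), not the paper's implementation:

```python
import cmath
import math

def pmf_from_cf(phi, n, m=2048):
    # Recover P(X = n) for an integer-valued X from its characteristic
    # function phi via the Fourier inversion integral, discretized with
    # the trapezoidal rule on [0, pi]:
    #   p(n) = (1/pi) * integral_0^pi Re(phi(u) * exp(-i*u*n)) du
    h = math.pi / m
    f = lambda u: (phi(u) * cmath.exp(-1j * u * n)).real
    total = 0.5 * (f(0.0) + f(math.pi)) + sum(f(k * h) for k in range(1, m))
    return h * total / math.pi

# Example: Poisson(lam = 3), characteristic function exp(lam*(e^{iu} - 1)).
lam = 3.0
phi = lambda u: cmath.exp(lam * (cmath.exp(1j * u) - 1.0))
exact = math.exp(-lam) * lam ** 2 / math.factorial(2)  # P(X = 2)
print(abs(pmf_from_cf(phi, 2) - exact))
```

Because the integrand is smooth and even, the trapezoidal rule here converges spectrally, so even modest grids recover the mass function to near machine precision, consistent with the precision and speed the abstract reports for cosine expansions.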
ABSTRACT In this paper, we propose a multi-objective evolutionary metaheuristic approach based on the Pareto Ant Colony Optimization (P-ACO) metaheuristic and the non-dominated sorting genetic algorithms (NSGA-II and NSGA-III) to solve a bi-objective portfolio optimization problem. P-ACO is used to select the best assets composing the efficient portfolio. Then, NSGA-II and NSGA-III are separately used to find the proportional weights of the budget allocated to the selected portfolio. The results obtained by the two algorithms were compared to designate the better-performing one. Finally, we compared our results with those of an exact method applied to the same problem. Numerical experiments performed on a set of instances from the literature revealed that the proposed combination of ant colony optimization and NSGA-III most often gave much better results than both the combination of ant colony optimization and NSGA-II and the iterative exact approach.
ABSTRACT Private banking is one of the most profitable services offered by financial institutions in Brazil. In this service, a client should consider the suitability of a range of banking products over his or her lifetime, based on dynamic objectives. It is a very complex service that involves regulation from governmental agencies, financial institutions’ interests, and clients’ objectives. It is not uncommon for these aspects to conflict, hindering the decision-making process and leading to unwanted decisions. This paper describes the development of a decision support system (DSS) for private banking, called OptPrivate, that integrates suitability assessment with portfolio selection, enabling the investor to choose a portfolio based on several conflicting objectives. The DSS treats the investment selection problem as two interconnected subproblems: suitability and capital allocation. In the former, the DSS is based on an integrated modeling approach that considers legal aspects and investors’ preferences through a fuzzy multiple-attribute decision-making (MADM) approach called FTOPSIS-Class. In the latter, a fuzzy multi-objective optimization model uses the sorting results from the previous step to define a portfolio, simultaneously considering risk, return, and the investor’s profile. To facilitate the DSS application, linguistic variables are used in several aspects of the decision-making process, including risk exposure. The DSS was validated through field tests at a well-respected private bank in Brazil. In all test cases, it recommended portfolios more suitable to the investor’s profile, with greater profitability and less volatility than those recommended by the financial analysts.
ABSTRACT This study addresses the application of the Stochastic Multicriteria Acceptability Analysis (SMAA) technique, in combination with the Exponential TODIM method (ExpTODIM), considering the proposition made by Petrobras to the Brazilian National Agency for Petroleum regarding the decommissioning of the Marlim and Voador production fields. The new approach, called SMAA-ExpTODIM, takes into consideration the different perspectives of the participants involved in decision making on decommissioning projects and incorporates them into its results. Besides incorporating decision makers’ uncertainties and ignorance zones, the approach explicitly considers decision making from the distinct perspectives of the Proponent and the Regulator. The SMAA-ExpTODIM approach increased the robustness of the formulation and modeling of the decommissioning issue by adopting different probability functions for the stochastic variables, representing profiles of risk aversion or risk propensity among institutional decision makers. For each decision-maker profile, different preference structures, or comfort levels, are created for the choice between decision alternatives.
ABSTRACT The Brazilian Navy (BN) has its combatant force concentrated in an area with only one exit to the sea. Nevertheless, Brazil has an extensive coast that needs to be patrolled and, in case of any kind of conflict, the BN must be able to defend the sovereignty and interests of the country. For this purpose, a robust and decentralized naval force is necessary to fulfill its mission. This paper aims to classify the most suitable cities in Brazil to be established as naval bases by applying the ELECTRE-MOr hybrid method. Seventeen cities with access to the sea and a pre-existing minimum port structure were analyzed. The alternatives were evaluated by BN specialists with more than twenty years of active service, in the light of operational, economic, and strategic criteria. As a result, five alternatives were classified as the most suitable headquarters for the decentralization of naval power: Belém/PA, São Luis/MA, Fortaleza/CE, Recife/PE, and Aratu/BA. This article is relevant to academia and society, as it applies a state-of-the-art method to support decision making in a problem significant to the sovereignty of Brazil, presenting a methodology that can be applied to issues at the tactical, operational, strategic, and political levels.
ABSTRACT A decentralized Supply Chain contains different decision levels with distinct decision makers, for example, shippers and carriers. Nevertheless, most studies consider only one decision-making level. In the Dynamic Vehicle Allocation (DVA) problem, the carrier is the decision-making agent and allocates vehicles to maximize its own profit, usually delaying shipments; the partnership principle, however, must be respected. This paper formulates the DVA problem using bi-level programming: the shipper’s objective is to minimize shipping delays, while the carrier’s objective is to maximize profits. An exact algorithm is used to solve the bi-level DVA problem. The results show potential applications in logistics, decreasing both transportation delays and costs when the carrier’s and shippers’ decisions are synchronized.
ABSTRACT This study analyzes performance measures in a Call Center with impatient customers, who may leave the system before being served. Abandonment of the waiting line is a measure of the subjective perception of the quality of the services offered, and one of the main concerns of Call Center managers. Based on data extracted from the case study of a Brazilian company, it is shown that the analytical queuing models M/Gc/1+G and M/G/c+G, with generic distributions representing service times and generic distributions (particularly mixed distributions) representing patience times, can be effective in evaluating the congestion of this Call Center. The sampled performance measures of this case are compared with those obtained through the M/Gc/1+G and M/G/c+G models, using non-mixed and mixed distributions based on the Exponential, Fatigue Life, and Normal distributions to represent customers’ patience. We are not aware of other studies in the literature along this line of research. The results indicate that, in general, analytical queuing models with abandonment that explore more elaborate distributions for service times and mixed distributions for patience times evaluate this Call Center more accurately than other queuing models with abandonment, such as the M/M/c+M, M/M/c+G, and M/G/c+M.
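The simplest of the abandonment models mentioned above, M/M/c+M (Erlang-A), is easy to explore numerically because it is a birth-death Markov chain. The sketch below, with illustrative parameters of our own choosing rather than the study's data, simulates its jump chain and estimates the fraction of customers who abandon:

```python
import random

def abandonment_fraction(lam, mu, theta, c, n_events=500_000, seed=7):
    # Jump-chain (Gillespie) simulation of the M/M/c+M (Erlang-A) queue.
    # In state n: arrivals occur at rate lam, service completions at rate
    # min(n, c)*mu, and abandonments at rate max(n - c, 0)*theta.
    rng = random.Random(seed)
    n = arrivals = abandons = 0
    for _ in range(n_events):
        a, s, b = lam, min(n, c) * mu, max(n - c, 0) * theta
        u = rng.random() * (a + s + b)
        if u < a:
            arrivals += 1
            n += 1
        elif u < a + s:
            n -= 1          # a customer completes service
        else:
            abandons += 1
            n -= 1          # a waiting customer gives up
    return abandons / arrivals

# Overloaded single server: roughly half the customers should abandon.
print(abandonment_fraction(lam=2.0, mu=1.0, theta=1.0, c=1))
```

Replacing the exponential patience clock with a general (or mixed) distribution is exactly the step that leads to the +G models the study evaluates; the memoryless state-only simulation above no longer suffices there, which is why analytical treatment of those models is valuable.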
ABSTRACT Air traffic management has become increasingly complex due to the growing use of air transport. One of the main management bottlenecks is planning the efficient use of runways for takeoff and landing. This paper investigates the Aircraft Landing Problem (ALP), which seeks to minimize earliness and tardiness in aircraft landing times by assigning each aircraft to a runway and sequencing the landings. A new mathematical formulation based on the Job Shop problem was proposed and compared with four formulations from the literature: three directly comparable, and a fourth containing a particularity that prevents direct comparison with the others. Computational tests were performed on 49 instances from the literature using the Gurobi Optimizer package. The formulations commonly used for the ALP have difficulty finding the optimal solution when the number of aircraft to land is large, i.e., more than 50 aircraft. Therefore, we proposed a matheuristic to solve instances that the Gurobi Optimizer cannot solve optimally. This matheuristic first finds an initial solution using relax-and-fix (RF) and then improves it with fix-and-optimize (FO). Comparisons were also made by taking the first feasible solution obtained by Gurobi and improving it with FO. Among the matheuristic variations, the combination of RF with FO obtained the best results and also proved efficient in relation to work in the literature that uses FO.
ABSTRACT Parametric DEA models presume a functional form for the efficiency frontier and are used for resource redistribution. Parabolic DEA is a particular case of these models, used to redistribute a single input under Variable Returns to Scale. In this study, we extend this model to redistribute multiple inputs simultaneously, ensuring that all DMUs become extreme-efficient after redistribution while maintaining their outputs. The proposed model is a single multi-objective Linear Programming problem that provides a single optimal solution. Two approaches are used to solve this model: the Weighted Sum of the Objective Functions and the Separation of Variables. Two numerical examples, considering single and multiple outputs, are presented, and the results obtained are identical for the two approaches.
ABSTRACT This study targets six North Asian and African hospitals that provide clinical treatment, diagnostic, and consultation services. The study revealed a lack of infrastructure in the selected hospitals to support the successful implementation of Environmental Management Systems (EMS). A factor analysis was performed based on five factors: EMS efficiency, the hospital’s culture, support from top management, legislation, and time and budget controls. Out of 23 sub-factors tested, 7 were found to control the differentiation of the factors. The results highlight human errors in handling chemical and medical materials that lead to environmental pollution, followed by a sub-factor indicating that control of hospital activities and correction of performance depend on the feedback received from the environmental information system. The study’s main conclusion is that achieving green hospital systems depends on successfully implementing the ISO 14001:2015 standard.
ABSTRACT In this research article, the service provision of a public company was evaluated through multivariate statistical control to determine the performance of its quality dimensions. The methodology comprised: 1) characterization of the information associated with the quality dimensions, drawn from consolidated databases recognized for their high quality, such as Elsevier and Inderscience, among others; 2) calculation of Six Sigma metrics (DPMO, Z-level, and performance), which allow, from a monthly average, a timely and periodic evaluation of the quality of the service provided by the company over the 12 periods of 2019; and 3) evaluation of the performance of the service dimensions in a global and comprehensive manner, through multivariate analysis. Finally, the quality of the company’s service is presented, allowing the control and continuous improvement of its processes through prompt replanning.
ABSTRACT Selecting the naval components to be part of a task force is a typical group-composition decision problem wherein each component of the task force should complement the others. This decision situation calls for outranking modeling tools instead of the additive ones usually used when selecting group components. This study applies Electre I to develop an original approach to a hypothetical problem: the composition of the Amphibian Readiness Group used in the United States. Despite being a hypothetical decision situation, the work uses real data from American media. We use Electre I because it is a native outranking multicriteria decision method designed to address choice decisions. Moreover, we use the Visual Outdeck web app (Outranking Decision and Knowledge app) to implement the model and the Electre I algorithm, together with a sensitivity analysis of the results regarding the weights and the agreement and disagreement cut-off levels. Further, we apply the same input data to an additive decision algorithm, allowing comparison of the results. Consequently, we established a task force configuration whose elements optimally complement one another. Furthermore, the study finds that this task force differs from the one that would be constituted using traditional additive modeling. The main contribution of this study is a robust discussion and original modeling that support military decision-makers in selecting a better set of task forces.
ABSTRACT In this research, three production inventory models with dependent demand are developed: the first model uses integrated stock- and price-dependent demand, the second uses stock-dependent demand, and the third uses price-dependent demand. The models are built on a bipartition of the production cycle, which in turn determines the holding cost. It was found that the holding cost is lower under integrated stock- and price-dependent demand than under stock-dependent or price-dependent demand individually. Detailed mathematical formulations are presented for each model, and applicable illustrations are given to clarify the suggested technique. In this scenario, the goal is to determine the order quantities and order intervals that result in the lowest total cost. A sensitivity analysis is offered for each of the three models. Visual Basic 6.0 was used to produce the required data.
ABSTRACT Processes that establish compliance do not, at first, seem to add to the value chain of companies. However, the need to address legislation or issues related to corporate governance, social management, and the environment leads large corporations to adopt such processes. This article establishes a plan for prioritizing the implementation of compliance processes in an electric power generation company, through the Value-Focused Thinking (VFT) problem-structuring method and the application of the new hybrid multicriteria method SAPEVO-WASPAS-2N, derived from the novel combination of SAPEVO-M (Simple Aggregation of Preferences Expressed by Ordinal Vectors - Multi Decision Makers) and WASPAS-2N (Weighted Aggregated Sum Product Assessment with two normalization techniques). The application of the hybrid SAPEVO-WASPAS-2N model proved consistent and robust, generating two priority orderings aligned with the strategic situation of the organization, based on the criteria established through the opinions of the decision makers.
ABSTRACT The increasing amount of waste creates liabilities for society. This work selects a biodigester to treat organic waste at Universidade Federal Fluminense. The VFT approach was used to structure the problem, and the multicriteria decision support method THOR 2 was used to choose the most preferred option under three scenarios. According to the experts, six criteria can be used to choose a biodigester, namely capacity, input of organic matter, input of water, average expected production of biogas, and average output of biofertilizer. The research showed that biodigester 7 had the best performance. The relevance of the study lies in its contribution to reducing organic solid waste in the environment and in applying the THOR 2 method in a new context.
ABSTRACT Research related to vehicle routing and its applications reveals the interest in supporting the authorities in solving society’s problems. In this vein, a bi-objective mixed integer linear programming model and metaheuristics based on a genetic algorithm and local search are presented, applied to reverse logistics through green open vehicle routing. The process consists of collecting solid urban waste at collection points, contributing to urban sustainability through the route plan determined by the proposal, optimizing distances and costs; it also measures the fuel consumption of the vehicles from their departure from the depot to the last collection point, as well as the CO2 emissions during routing. The proposals, implemented in GLPK and Python, obtain results in various scenarios. The model generates solutions quickly in scenarios with 5 to 25 collection points; for larger scenarios it does not find solutions within the time limit of 7200 seconds. The metaheuristics show greater potential: for 31 collection points, the processing time was 2.3 seconds, a good indication for larger scenarios. Three sectors of the city of Trujillo are used to evaluate our proposal.
ABSTRACT One of the challenges faced by industrial companies, especially those with intensively used plants and other assets, is the proper sizing of the stock of strategic spare parts: items with a history of low consumption, but whose lack can delay repair and maintenance services and, in the extreme, lead to operational shutdowns. Effects can range from small to large scale. While a large stock of strategic items provides a greater guarantee of operational availability, it brings additional storage and preservation costs, in addition to fixed capital outlays; a compromise solution is needed. The use of traditional or simpler techniques to infer the ideal stock level for each spare part often suffers from a lack of historical data, especially in installations in the initial phase of the operation and maintenance cycle. Another problem is the diversity of applications for some materials. The present work proposes a method based on reliability and hierarchical Bayesian models (HBMs) to overcome the problems of data scarcity, uncertainty, and variability between applications of each spare part. The criticality of the equipment or assets in which the spare parts are applied is taken into account in the method. The hierarchical Bayesian model enables updating information as new consumption of strategic items is registered. The method is tested on a stationary offshore oil and gas unit.
ABSTRACT This article presents a Multicriteria Decision Aiding (MCDA) model for assessing risk management maturity. To this end, it proposes the use of a Maturity Model (MM) for risk management aligned with the ELECTRE TRI method. ELECTRE TRI was chosen as the sorting method because it has a strong axiomatic structure based on the concordance and discordance relations between each alternative and the profiles that delimit its classes. To test the proposal, a case study was carried out at a real company in the construction industry. For the development of the risk management maturity assessment model, a questionnaire was applied to collect data on risk management practices in the organization. The collected data were then modeled in a Decision Support System applying ELECTRE TRI, which classified the organization’s risk management maturity at level 3 (managed).
ABSTRACT In this paper, we propose a math-heuristic that combines mathematical programming techniques with a heuristic approach based on the Simulated Annealing metaheuristic to solve the Two-Echelon Capacitated Vehicle Routing Problem (2E-CVRP). This problem arises in the context of a distribution network divided into two levels: satellite facilities that connect customers to fulfilment centers where freight originates. As the problem is NP-hard, the proposed approach adopts a cluster-first, route-second math-heuristic, which is particularly appropriate for the problem sizes most commonly found in practice. Experiments on benchmark instances demonstrate the effectiveness of the proposed cluster-first, route-second math-heuristic, which utilizes package solvers (CPLEX and TSP CONCORDE), in solving small instances when compared to an exact method. This research demonstrates the potential of applying package solvers within heuristic structures for solving the CVRP. Although the presented math-heuristic has limitations, mainly due to the extensive use of mathematical programming and the chosen characteristics of the implemented local search operators, it can quickly generate high-quality initial solutions for medium and large instances. By showcasing the cluster-first, route-second approach, this paper provides a framework that can be further improved by local search or embedded in other metaheuristics, such as GRASP or tabu search, with practical implications for various industries.
ABSTRACT In the Minimum Dispersion Routing Problem (MDRP), a set of vertices must be served by tours whose trajectories are defined so as to reduce the dispersion of vehicles. Tours must be related to each other and impose a spatial and temporal synchronization among vehicles, quantified by an original dispersion metric used in the objective function to be minimized. In this paper, we demonstrate that Euclidean Traveling Salesman Problem solutions can be found by MDRP solvers. We describe a method to reduce Euclidean Traveling Salesman Problem instances to Minimum Dispersion Routing Problem instances in polynomial time, proving that the latter is NP-hard.
ABSTRACT In the present article, some fixed point theorems are investigated for two pairs of weakly compatible maps through (Ω, ∆)-type weak contractive maps in the framework of fuzzy metric spaces. The results studied in this work generalize some recent results in the literature. Illustrative examples are presented in the last section to check the authenticity of our results.
ABSTRACT This paper studies the bandwidth reduction problem for large-scale sparse matrices in serial computations. A heuristic for bandwidth reduction reorders the rows and columns of a given sparse matrix, placing entries with non-null values as close to the main diagonal as possible. Recently, a paper proposed the FNCHC+ heuristic, a variant of the Fast Node Centroid Hill-Climbing algorithm. The FNCHC+ heuristic presented better results than other existing heuristics in the literature when applied to reduce the bandwidth of large-scale graphs (of the underlying matrices) with up to 18.6 million vertices and up to 57.2 million edges. The present paper provides new experiments with even larger graphs. Specifically, it performs experiments with test problems containing up to 24 million vertices and 130 million edges. The results confirm that the FNCHC+ algorithm is the state-of-the-art metaheuristic for reducing the bandwidth of large-scale matrices.
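To make the objective concrete: the bandwidth of a symmetric sparse matrix under a vertex ordering, and a classic reordering heuristic in the same family of ideas, can be sketched as follows. This is a simplified Cuthill-McKee on a toy graph, not the FNCHC+ algorithm itself:

```python
from collections import deque

def bandwidth(adj, order):
    # Bandwidth of the symmetric matrix whose nonzero pattern is the
    # adjacency structure `adj`, under the vertex ordering `order`.
    pos = {v: i for i, v in enumerate(order)}
    return max((abs(pos[u] - pos[v]) for u in adj for v in adj[u]), default=0)

def cuthill_mckee(adj):
    # Simple Cuthill-McKee reordering: BFS from a minimum-degree vertex,
    # visiting neighbours in order of increasing degree.
    order, seen = [], set()
    for start in sorted(adj, key=lambda v: len(adj[v])):
        if start in seen:
            continue
        seen.add(start)
        queue = deque([start])
        while queue:
            v = queue.popleft()
            order.append(v)
            for u in sorted(adj[v], key=lambda w: len(adj[w])):
                if u not in seen:
                    seen.add(u)
                    queue.append(u)
    return order

# Path graph 0-1-2-3-4 given in a scrambled ordering.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
scrambled = [2, 4, 0, 3, 1]
print(bandwidth(adj, scrambled), bandwidth(adj, cuthill_mckee(adj)))
```

On the path graph the scrambled ordering yields bandwidth 4, while the BFS-based reordering recovers the optimal bandwidth of 1; heuristics such as FNCHC+ pursue the same goal at the scale of tens of millions of vertices.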
ABSTRACT Vehicle Routing Problems (VRP) have been widely researched as a way of optimizing routes, minimizing distances, and planning deliveries efficiently, but the issue of risk in the VRP has received less attention over time. Addressing it is essential to increasing transport safety, avoiding interruptions in supply chains, and improving delivery reliability. Therefore, this study aims to support decision makers in planning routes for road freight transportation considering not only the logistics cost but also travel safety in the face of road hazards. An analytical approach based on statistics was developed in which data on vehicle accidents and road features were used to estimate a risk cost via Monte Carlo simulation. A bi-objective approach based on the PROMETHEE II and ε-constraint methods was applied to the Capacitated Vehicle Routing Problem (CVRP) to analyze the conflict between logistic cost and accident risk. The key contribution of this study is an analytical approach combining statistics, Monte Carlo simulation, and multicriteria methods in a CVRP model to explore the trade-off between logistic costs and accident risk expressed as a risk cost. The outcomes prove useful in practice for analyzing transportation decisions in VRP route planning that accounts for accident risk.
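The risk-cost estimation step can be illustrated with a minimal Monte Carlo sketch. The accident probability and lognormal severity model below are hypothetical placeholders, not values or distributions from the study:

```python
import math
import random

def risk_cost(accident_prob, cost_sampler, n_runs=100_000, seed=42):
    # Monte Carlo estimate of the expected accident cost of one arc
    # traversal: each run draws whether an accident occurs and, if so,
    # samples its (random) severity cost.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        if rng.random() < accident_prob:
            total += cost_sampler(rng)
    return total / n_runs

# Hypothetical arc: 0.2% accident chance per traversal, lognormal severity.
est = risk_cost(0.002, lambda rng: rng.lognormvariate(9.0, 0.5))
analytic = 0.002 * math.exp(9.0 + 0.5 ** 2 / 2)  # closed-form E[cost]
print(est, analytic)
```

Summing such per-arc expected costs over a candidate route gives the risk-cost objective that can then be traded off against the logistic cost in a bi-objective CVRP.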
ABSTRACT The problem of finding the shortest path between a source and a destination node, commonly represented on graphs, has several computational algorithms that attempt to find what is called the minimum path. Depending on the number of nodes between the source and destination, finding the shortest path can demand a high computational cost (with polynomial complexity). One way to reduce this cost is parallelism, which divides the algorithm’s tasks among the processing cores. This article presents a comparative analysis of the main shortest-path algorithms: Dijkstra, Bellman-Ford, Floyd-Warshall, and Johnson. The performance of each algorithm was evaluated under different parallelization approaches, applied to general and open-pit mining databases from the literature. The experimental results showed a performance improvement of about 55% in execution time, depending on the chosen parallelization point.
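As a reference point for the serial baseline, a compact Dijkstra with a binary heap, whose edge-relaxation step is a typical candidate for parallelization, can be sketched as follows; the toy graph is purely illustrative:

```python
import heapq

def dijkstra(graph, source):
    # Classic Dijkstra with a binary heap: O((V + E) log V).
    # graph: {node: [(neighbour, weight), ...]} with non-negative weights.
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist.get(v, float("inf")):
            continue            # stale heap entry, already settled
        for u, w in graph.get(v, []):
            nd = d + w
            if nd < dist.get(u, float("inf")):
                dist[u] = nd    # relax edge (v, u)
                heapq.heappush(heap, (nd, u))
    return dist

g = {"s": [("a", 2), ("b", 5)], "a": [("b", 1), ("t", 6)], "b": [("t", 2)]}
print(dijkstra(g, "s")["t"])   # shortest s->t path s-a-b-t, cost 5
```

Bellman-Ford relaxes all edges in rounds (and so parallelizes per edge), while Floyd-Warshall and Johnson target all-pairs distances; the comparative study above weighs these structural differences against the parallelization overhead.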
ABSTRACT Irregular strip packing problems are present in a wide variety of industrial sectors, such as the garment, footwear, furniture, and metal industries. The goal is to find a layout in which an object will be cut into small pieces with minimum raw-material waste. Once a layout is obtained, it is necessary to determine the path the cutting tool must follow to cut the pieces from the layout; there, the goal is to minimize the cutting distance (or time). Although industries frequently use this sequential solution, the trade-off between the packing and cutting path problems can significantly impact production cost and productivity. A layout with minimum raw-material waste, obtained by solving the packing problem, can imply a longer cutting path than another layout with more material waste but a shorter cutting path, obtained through an integrated strategy. Layouts with shorter cutting paths are worthy of consideration because they may improve the productivity of the cutting process. In this paper, both problems are solved together using a biobjective matheuristic based on the Biased Random-Key Genetic Algorithm. Our approach uses this algorithm to select a subset of the no-fit polygon edges to feed the mathematical model, which computes the layout waste and cutting path length. Solving the strip packing and cutting path problems simultaneously allows the decision-maker to analyze the compromise between material waste and cutting path distance. As expected, the computational results showed the relevance of the trade-off between these problems and presented a set of solutions for each instance solved.
ABSTRACT The Least Cost Influence Problem is a combinatorial optimization problem that appears in the context of social networks. The objective is to give incentives to individuals of a network so that some information spreads to a desired fraction of the network at minimum cost. We introduce a problem-dependent algorithm within a branch-and-bound scheme to compute a dual bound for this problem. The idea is to exploit the connectivity properties of the sub-graphs of the input graph associated with each node of the branch-and-bound tree and use them to increase each sub-problem’s lower bound. Our algorithm finds a lower bound tighter than the LP relaxation in time linear in the size of the graph. Computational experiments with synthetic graphs and real-world social networks show the benefit of the proposed bounds, with gains in running time or reductions in the optimality gap for exact solutions to the problem.