Browsing by Subject "Operations research"
Now showing 1 - 19 of 19
Item: A Solution of One Type of The N-Job, M-Machine Sequencing Problem (Texas Tech University, 1966-05). Smith, Richard Dowlen.
Not available.

Item: An adaptive tabu search approach to cutting and packing problems (2003). Harwig, John Michael; Barnes, J. Wesley.
This research implements an adaptive tabu search procedure that controls the partitioning, ordering, and orientation features of a two-dimensional orthogonal packing solution; details an effective fine-grained objective function for cutting and packing problems; and presents move neighborhoods that find very good solutions in a short period of time. Results meet or exceed all other techniques reported in the literature on a common test set across all 500 instances. These techniques extend naturally to both two-dimensional arbitrarily shaped and three-dimensional orthogonal packing heuristics that use rules based on given partitions, orders, and orientations. Techniques for extending this research in those directions, and methods that might improve the search, are also presented.
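To make the move-based search concrete, here is a toy sketch of a tabu search over rectangle ordering and orientation. Everything in it is an illustrative assumption (a fixed-width strip, a naive shelf decoder, random swap/rotate moves); it is not the adaptive procedure the dissertation develops.

```python
import random

def shelf_height(rects, width):
    """Decode a sequence of (w, h) rectangles with a shelf heuristic;
    return the strip height used (lower is better)."""
    x = shelf_h = total = 0
    for w, h in rects:
        if x + w > width:              # current shelf is full: open a new one
            total += shelf_h
            x = shelf_h = 0
        x += w
        shelf_h = max(shelf_h, h)
    return total + shelf_h

def tabu_search(rects, width, iters=500, tenure=7):
    cur = list(rects)
    best, best_val = cur[:], shelf_height(cur, width)
    tabu = {}                          # move -> iteration until which it is tabu
    for it in range(iters):
        candidates = []
        for _ in range(30):            # sample the swap/rotate neighborhood
            i, j = random.sample(range(len(cur)), 2)
            nbr = cur[:]
            if random.random() < 0.5:
                nbr[i], nbr[j] = nbr[j], nbr[i]
                move = ("swap", i, j)
            else:
                nbr[i] = (nbr[i][1], nbr[i][0])   # rotate one rectangle 90 degrees
                move = ("rot", i, i)
            candidates.append((shelf_height(nbr, width), move, nbr))
        candidates.sort(key=lambda c: c[0])
        for val, move, nbr in candidates:
            # take the best non-tabu move; aspiration overrides the tabu list
            if tabu.get(move, -1) < it or val < best_val:
                cur = nbr
                tabu[move] = it + tenure
                if val < best_val:
                    best, best_val = nbr[:], val
                break
    return best, best_val
```

For example, `tabu_search([(4, 2), (3, 5), (6, 1), (2, 2)], width=8)` returns an ordering/orientation together with the strip height its decoding uses.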
Item: An analysis of inter-relationships of multiple criteria in a flowshop with set-up sequence dependence (Texas Tech University, 1976-08). Pulle, Christopher V.
Not available.

Item: An analysis of management science staffing in Texas (Texas Tech University, 1966-08). Williams, John Garland.
Much emphasis today is placed on management science techniques, both quantitative and qualitative, as tools for helping the business manager reach more effective and accurate decisions. This trend has brought about many problems that top management must solve. The main objective of this report is to investigate one of these problem areas: staffing management science positions. In other words, what general types of personnel are best qualified for work in the management science field, as determined by management scientists in Texas? This report attempts to answer that question by developing a basic criterion for the general personnel requirements involved in organizing and selecting an effective, efficient management science staff.

Item: An evaluation of dispatching, due date, labor assignment, and input control policy decisions in a dual resource constrained job shop (Texas Tech University, 1990-12). Salegna, Gary J.
This study examines the interaction of dispatching, labor, due date assignment, and input control decision variables in the dual resource constrained (DRC) job shop. The DRC job shop differs notably from the machine-limited job shop in that labor is constrained as well as machinery. The majority of job shop scheduling research has assumed that labor is always available to process any job; this assumption does not reflect the actual environment of most job shops. In reality, not all machines can be manned simultaneously, and labor is both a flexible and a limiting resource. In addition, few studies have examined due date assignment or job releasing decisions within the DRC job shop. The initial experiments examine the effectiveness of dispatching and due date assignment methods (under "loose" and "tight" due dates) in a DRC job shop. The contingent experiments assess the effect of input control on the best dispatching and due date assignment policies, as determined by the initial experiments, for the DRC job shop. These operating policies are evaluated over five levels of due date tightness in the contingent experiments. The findings of the initial experiments indicate that labor has a significant interaction with dispatching and due date rules. The results of the contingent experiments also indicate a significant interaction between input control strategies and due date assignment methods. The implications of this study suggest that the dependencies among the decision variables (dispatching, due date, labor, and input control) must be considered in order to develop an effective scheduling policy for the DRC job shop.
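The dual-resource constraint is the crux of Salegna's setting: an operation needs both a free machine and a free worker. The event-driven toy below (invented data layout, earliest-due-date dispatching, one homogeneous worker pool; far simpler than the study's simulation) shows how a small worker pool gates every start decision.

```python
import heapq

def simulate_drc(jobs, n_machines, n_workers):
    """jobs: list of (due_date, [(machine, proc_time), ...]) routings.
    Returns {job id: completion time} under earliest-due-date dispatching."""
    queues = [[] for _ in range(n_machines)]   # per-machine EDD priority queues
    machine_free = [True] * n_machines
    workers = n_workers                        # the second, labor, constraint
    events, completion = [], {}                # events: (finish, machine, job, step)
    for jid, (due, routing) in enumerate(jobs):
        heapq.heappush(queues[routing[0][0]], (due, jid, 0))

    def try_start(m, now):
        nonlocal workers
        if machine_free[m] and workers > 0 and queues[m]:
            due, jid, step = heapq.heappop(queues[m])
            machine_free[m] = False
            workers -= 1                       # worker is tied up for the operation
            heapq.heappush(events, (now + jobs[jid][1][step][1], m, jid, step))

    for m in range(n_machines):
        try_start(m, 0.0)
    while events:
        t, m, jid, step = heapq.heappop(events)
        machine_free[m] = True
        workers += 1                           # release machine and worker
        due, routing = jobs[jid]
        if step + 1 < len(routing):
            heapq.heappush(queues[routing[step + 1][0]], (due, jid, step + 1))
        else:
            completion[jid] = t
        for k in range(n_machines):            # freed resources may enable starts anywhere
            try_start(k, t)
    return completion
```

Setting n_workers equal to n_machines recovers the machine-limited shop; setting it lower makes labor a binding resource, which is the regime the study's policy interactions concern.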
Item: An investigation of the graph theoretic properties of the sequencing problem (Texas Tech University, 1968-06). Zurla, Charles Francis.
Not available.

Item: An investigation of the use of artificial intelligence in solving the job shop sequencing problem (Texas Tech University, 1975-05). Iskander, Wafik Halim.
Not available.

Item: Analysis of four stage series queueing models (Texas Tech University, 1976-08). Liu, Yiqing.
Not available.

Item: Approximations, simulation, and accuracy of multivariate discrete probability distributions in decision analysis (2012-05). Montiel Cendejas, Luis Vicente; Bickel, J. Eric; Morton, David P.; Hasenbein, John J.; Dyer, James S.; Lake, Larry W.
Many important decisions must be made without full information. For example, a woman may need to make a treatment decision regarding breast cancer without full knowledge of important uncertainties, such as how well she might respond to treatment. In the financial domain, in the wake of the housing crisis, the government may need to monitor the credit market and decide whether to intervene. A key input in this case would be a model describing the chance that one person (or company) will default given that others have defaulted. However, such a model requires addressing the lack of knowledge regarding the correlation between groups or individuals. How to model and make decisions when only partial information is available is a significant challenge. In the past, researchers have made arbitrary assumptions regarding the missing information. In this research, we developed a modeling procedure that can be used to analyze many possible scenarios subject to strict conditions. Specifically, we developed a new Monte Carlo simulation procedure to create a collection of joint probability distributions, all of which match whatever information we have. Using this collection of distributions, we analyzed the accuracy of different approximations such as maximum entropy or copula models. In addition, we proposed several new approximations that outperform previous methods. The objective of this research is four-fold. First, provide a new framework for approximation models: we presented four new models to approximate joint probability distributions based on geometric attributes and compared their performance to existing methods. Second, develop a new joint distribution simulation procedure (JDSIM) to sample joint distributions from the set of all possible distributions that match available information; this procedure can then be applied to different scenarios to analyze the sensitivity of a decision or to test the accuracy of an approximation method. Third, test the accuracy of seven approximation methods under a variety of circumstances. Specifically, we addressed the following questions within the context of multivariate discrete distributions: Are there new approximations that should be considered? Which approximation is the most accurate, according to different measures? How accurate are the approximations as the number of random variables increases? How accurate are they as the underlying dependence structure changes? How does accuracy improve as lower-order assessments are added? What are the implications of these findings for decision analysis practice and research? While these questions are easy to pose, they are challenging to answer. For decision analysis, the answers open a new avenue for addressing partial information, which brings us to the last contribution. Fourth, propose a new approach to decision making with partial information. The exploration of old and new approximations, and the capability of creating large collections of joint distributions that match expert assessments, provide new tools that extend the field of decision analysis. In particular, we presented two sample cases that illustrate the scope of this work and its impact on decision making under uncertainty.
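A minimal way to see what "a collection of joint distributions that all match the available information" means: start from random positive matrices and push each onto the assessed marginals with iterative proportional fitting. This is only an illustration (it explores just part of the feasible set and is not the JDSIM procedure); the numbers are invented.

```python
import numpy as np

def random_joint_matching_marginals(p_x, p_y, iters=200, seed=None):
    """Draw one random two-variable joint PMF whose marginals match p_x, p_y."""
    rng = np.random.default_rng(seed)
    p_x, p_y = np.asarray(p_x, float), np.asarray(p_y, float)
    joint = rng.random((len(p_x), len(p_y)))          # random positive start
    joint /= joint.sum()
    for _ in range(iters):                            # iterative proportional fitting
        joint *= (p_x / joint.sum(axis=1))[:, None]   # match row marginals
        joint *= (p_y / joint.sum(axis=0))[None, :]   # match column marginals
    return joint

# Stress-test an approximation (here: independence) against many feasible joints.
p_x, p_y = [0.3, 0.7], [0.2, 0.5, 0.3]
joints = [random_joint_matching_marginals(p_x, p_y, seed=s) for s in range(1000)]
worst = max(np.abs(j - np.outer(p_x, p_y)).max() for j in joints)
print(worst)   # worst-case elementwise error of the independence approximation
```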
Item: Branch-and-cut for piecewise linear optimization (2012-05). Rajat, Gupta; Farias, Ismael R. d.; Simonton, James L.; Matis, Timothy I.; Smith, Phillip; Zhang, Yuanlin.
In this research we report and analyze the results of extensive testing of branch-and-cut for piecewise linear optimization using cutting planes. We tested large instances of the transshipment problem using MIP, LOG, and SOS2 formulations. Besides analyzing the performance of the cuts, we also analyze the effect of formulation on the performance of branch-and-cut. These tests were conducted using the callable libraries of CPLEX and GUROBI. Finally, we also analyzed results for piecewise linear optimization problems with semi-continuous constraints.
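The SOS2 idea in this abstract is compact enough to show in full. The sketch below (hypothetical breakpoints and demand; PuLP with its bundled CBC solver) writes a univariate piecewise linear cost in the lambda (convex-combination) form, with binaries enforcing the SOS2 rule that at most two adjacent weights are nonzero; solvers like CPLEX and GUROBI can instead branch on the SOS2 condition directly.

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary

xs = [0, 2, 5, 8]       # breakpoints (hypothetical data)
fs = [0, 3, 4, 10]      # piecewise linear cost at each breakpoint
n = len(xs)

prob = LpProblem("piecewise_linear", LpMinimize)
lam = [LpVariable(f"lam{i}", lowBound=0) for i in range(n)]
seg = [LpVariable(f"seg{i}", cat=LpBinary) for i in range(n - 1)]

prob += lpSum(f * l for f, l in zip(fs, lam))   # objective: f(x) in lambda form
prob += lpSum(lam) == 1
prob += lpSum(seg) == 1                          # exactly one active segment
prob += lam[0] <= seg[0]                         # SOS2: lam_i > 0 only next to
prob += lam[n - 1] <= seg[n - 2]                 # the selected segment
for i in range(1, n - 1):
    prob += lam[i] <= seg[i - 1] + seg[i]

x = lpSum(b * l for b, l in zip(xs, lam))        # x as a convex combination
prob += x >= 4                                   # an illustrative demand constraint
prob.solve()
print(sum(b * l.value() for b, l in zip(xs, lam)), [l.value() for l in lam])
```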
Item: Ctrl.FRAME: a control-theoretical framework for resource allocation management in engineering (2011-12). Mozano, Ashton; Barber, Suzanne; Graser, Thomas.
The Software Life Cycle (SLC) often comprises a complex sequence of processes, each with many subparts, where execution decisions throughout the pipeline can greatly affect the success or failure of a given project. Some of the most important decisions involve the allocation of scarce resources throughout the SLC, which are often based on estimates of future market demand and various extraneous factors of high stochasticity. Despite numerous efforts at standardization, many projects remain highly dependent on the subjective aptitude of individual managers, who may in turn rely on ad hoc techniques rather than standardized and repeatable ones. The result can be unpredictability and undue reliance on specific individuals. This paper considers imposing a mathematical framework on two key aspects of the SLC: deciding how to dynamically allocate available resources throughout the development pipeline, and deciding when to stop further work on a given task in light of the associated Return On Investment (ROI) metrics. In so doing, the software development process is modeled as a problem in New Product Development (NPD) management, which can be approached using control theory and stochastic combinatorial optimization techniques. The paper begins by summarizing previous developments in these fields and proposes future research directions for solving complex resource allocation problems in stochastic settings. The outcome is a formal framework that, when combined with competent Configuration Management techniques, can rapidly achieve near-optimal solutions at each stage of the SLC in a standardized manner.

Item: Investigation of workload smoothing in the performance of a dual resource constrained job shop (Texas Tech University, 1992-08). Murray, Mary Ann Sumstad.
This study examined the effect of workload smoothing on the performance of a dual resource constrained (DRC) job shop utilizing an integrated system. A simulation study was conducted in two stages. The primary experiments evaluated the effect of four smoothing rules, three order review/release (ORR) rules, and six dispatching rules at a 90% shop utilization level on six performance measures. The second set of experiments provided a sensitivity analysis using the four smoothing rules, three ORR rules, and the four best-performing dispatching rules from the primary experiments at an 85% shop utilization level. Analysis of variance, ranked data, and interaction plots were examined to determine which policies provided the best results in both sets of experiments. Results of this research indicate the following. (1) Workload smoothing had no significant effect on the performance of the DRC job shop. At the 90% shop utilization level, ORR provided the necessary control using the Maximum Shopload (MSL) release mechanism, except for the performance measures of standard deviation of flowtime and percent of jobs tardy, where the Immediate Release (IMR) rule performed better. At the 85% shop utilization level, IMR performed better and the primary decision became which dispatching rule to select, except for the performance measure of standard deviation of flowtime, where extreme smoothing and MSL performed better with either the Earliest Due Date (EDD) or Modified Due Date (MDD) dispatching rule. It appears more control is required at the lower utilization level to reduce the variation of individual jobs about the mean flowtime. (2) The Maximum Jobload (MJL) release rule designed for this study performed as well as, but no better than, the MSL rule. (3) The "crossover phenomenon" was not an issue, as the rules that did exhibit evidence of crossing over were either not candidates for consideration in policy decisions or members of a group of rules that were not significantly different from each other and were considered as a group.

Item: Minimal cost flow problem with overflow node penalty functions (Texas Tech University, 1976-05). Hui, Yer Van.
Not available.

Item: Optimization of production allocation under price uncertainty: relating price model assumptions to decisions (2011-08). Bukhari, Abdulwahab Abdullatif; Jablonowski, Christopher J.; Lasdon, Leon S.; Dyer, James S.
Allocating production volumes across a portfolio of producing assets is a complex optimization problem. Each producing asset possesses different technical attributes (e.g., crude type), facility constraints, and costs. In addition, there are corporate objectives and constraints (e.g., contract delivery requirements). While complex, such a problem can be specified and solved using conventional deterministic optimization methods. However, there is often uncertainty in many of the inputs, and in these cases the appropriate approach is neither obvious nor straightforward. One of the major uncertainties in the oil and gas industry is the commodity price assumption(s). This paper investigates this problem in three major sections: (1) we specify an integrated stochastic optimization model that solves for the optimal production allocation for a portfolio of producing assets when there is uncertainty in commodity prices; (2) we then compare the solutions that result when different price models are used; and (3) we perform a value of information analysis to estimate the value of more accurate price models. The results show that the optimal production allocation is a function of the price model assumptions. However, the differences between models are minor, and thus the value of choosing the “correct” price model, or similarly of estimating a more accurate model, is small. This work falls in the emerging research area of decision-oriented assessments of information value.
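The paper's core comparison (the same allocation problem solved under different price models) can be mimicked with a scenario-based LP. All numbers and names below are invented; this is a sketch of the problem shape, not the integrated model in the paper.

```python
from pulp import LpProblem, LpVariable, LpMaximize, lpSum

def optimal_allocation(scenario_margins, probs, capacity, min_delivery):
    """Choose per-asset volumes (before prices are revealed) to maximize
    expected margin, subject to capacities and a contract delivery floor."""
    assets = range(len(capacity))
    prob = LpProblem("allocation", LpMaximize)
    q = [LpVariable(f"q{a}", lowBound=0, upBound=capacity[a]) for a in assets]
    prob += lpSum(p * m[a] * q[a]                    # expected total margin
                  for p, m in zip(probs, scenario_margins) for a in assets)
    prob += lpSum(q) >= min_delivery                 # contract requirement
    prob.solve()
    return [v.value() for v in q]

capacity, min_delivery = [100, 80, 60], 150
allocA = optimal_allocation([[5, -1, 6]], [1.0], capacity, min_delivery)
allocB = optimal_allocation([[3, 4, 6]], [1.0], capacity, min_delivery)
print(allocA, allocB)    # the allocation shifts with the price model
```

Because this sketch is linear and the volumes are committed before prices are revealed, only expected margins matter: two scenario sets with identical means yield identical allocations. Recourse or nonlinearity is what lets finer price-model structure move the decision, which is consistent with the paper's finding that the value of a more accurate price model can be small.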
Item: Prioritization via stochastic optimization (2010-05). Koc, Ali; Popova, Elmira; Morton, David P.; Bard, Jonathan; Caramanis, Constantine; Hess, Stephen; Kutanoglu, Erhan.
We take a novel perspective on real-life decision-making problems involving binary activity-selection decisions that compete for scarce resources. The current operations research literature approaches these problems by forming an optimal portfolio of activities that meets the specified resource constraints. However, practitioners in industry and government often do not take the optimal-portfolio approach; instead, they form a rank-ordered list of activities and select those that have the highest priority. The academic literature tends to discredit such ranking schemes because they ignore dependencies among the activities. Practitioners, on the other hand, sometimes discredit the optimal-portfolio approach because if the problem parameters change, the set of activities that was once optimal no longer remains optimal. Even worse, the new optimal set of activities may exclude some of the previously optimal activities, which they may have already selected. Our approach takes both viewpoints into account. We rank activities considering both the uncertainty in the problem parameters and the optimal portfolio that will be obtained once the uncertainty is revealed. We use stochastic integer programming as a modeling framework. We develop several mathematical formulations, discuss their relative merits, and compare them theoretically and computationally. We also develop cutting planes for these formulations to improve computation times. To handle larger real-life problem instances, we develop parallel branch-and-price algorithms for a capital budgeting application. Specifically, we construct a column-based reformulation, develop two branching strategies and a tabu search-based primal heuristic, propose two parallelization schemes, and compare these schemes in parallel computing environments using commercial and open-source software. We give applications of prioritization in facility location and capital budgeting problems. In the latter application, we rank maintenance and capital-improvement projects at the South Texas Project Nuclear Operating Company, a two-unit nuclear power plant in Wadsworth, Texas. We compare our approach with several ad hoc ranking schemes similar to those used in practice.

Item: Random sampling in flowshop scheduling (Texas Tech University, 1987-05). Kumar, Krishna S.
Not available.

Item: Scheduling flowshops with limited in-process wait (Texas Tech University, 1992-12). Liou, Jin-pin.
The flowshop problem has been studied for decades, since Johnson published his two-machine and three-machine optimal algorithms in 1954. The static deterministic permutation flowshop problem is one of scheduling N jobs in a shop containing M machines, where each job with fixed processing times has to be processed on every machine, each job is processed on one machine at a time without preemption, and every job follows the same ordering of machines as it is processed. This is identified as the conventional flowshop problem. This research addresses a new static deterministic permutation flowshop in which, after processing on a machine, each job has to be processed by the next machine before an allowable waiting time has passed; the objective is to minimize the makespan. This new problem is identified as the limited in-process wait flowshop problem, and it is found to be NP-complete. A branch and bound algorithm based on a composite bound is presented, and 18 heuristic algorithms are proposed. The heuristics are compared with the nonparametric Friedman test in terms of makespan, average job waiting time, and CPU time consumed, and a sensitivity analysis of the relative performance of the heuristics is conducted. The research shows that heuristics using simulated annealing or tabu search usually perform best in minimizing the makespan, but at the cost of substantial computational effort. (A small makespan-evaluation sketch for this limited-wait variant appears after this list.)

Item: Scheduling of flow shop problems with finite intermediate storage (Texas Tech University, 1989-05). Azim, Muhammad A.
Not available.

Item: Sequential application of simple scheduling rules (Texas Tech University, 1976-05). Shue, Li-yen.
Not available.
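Liou's limited in-process wait constraint (above) changes even how a schedule is evaluated: a job may have to be delayed on earlier machines so that it never waits too long before the next one. The sketch below (assumed data layout, one global wait limit) evaluates the makespan of a permutation with a forward pass for earliest starts and a backward pass that delays earlier operations whenever a wait would exceed the limit; the dissertation's branch and bound and 18 heuristics build on this kind of evaluation but are far more elaborate.

```python
def limited_wait_makespan(perm, proc, max_wait):
    """Makespan of permutation `perm` in a flowshop with processing times
    proc[j][m], where a job may wait at most max_wait between machines."""
    n_mach = len(proc[0])
    avail = [0.0] * n_mach                        # machine ready times
    for j in perm:
        start = [0.0] * n_mach
        for m in range(n_mach):                   # forward: earliest starts
            prev_done = start[m - 1] + proc[j][m - 1] if m else 0.0
            start[m] = max(avail[m], prev_done)
        for m in range(n_mach - 1, 0, -1):        # backward: cap each wait
            start[m - 1] = max(start[m - 1],
                               start[m] - max_wait - proc[j][m - 1])
        for m in range(n_mach):
            avail[m] = start[m] + proc[j][m]
    return avail[-1]

# e.g. limited_wait_makespan([0, 2, 1], [[3, 2], [1, 4], [2, 2]], max_wait=1)
```

With max_wait = 0 this evaluates the no-wait flowshop; with max_wait very large it reduces to the conventional permutation flowshop of Johnson's line of work.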