Browsing by Subject "Simulation"
Now showing 1 - 20 of 107
Item: A column based variance analysis approach to static reservoir model upgridding (Texas A&M University, 2008-10-10). Talbert, Matthew Brandon.

The development of coarsened reservoir simulation models from high-resolution geologic models is a critical step in a simulation study. The optimal coarsening sequence becomes particularly challenging in a fluvial channel environment, where channel sinuosity and orientation can result in pay/non-pay juxtaposition in many regions of the geologic model. It is also challenging in tight gas sandstones, where sharp changes between sandstone and shale beds are predominant and maintaining the pay/non-pay distinction is difficult. Under such conditions, uniform coarsening will mix pay and non-pay zones and will likely produce geologically unrealistic simulation models that yield erroneous performance predictions. In particular, the upgridding algorithm must keep pay and non-pay zones distinct through a non-uniform coarsening of the geologic model. We present a coarsening algorithm to determine an optimal reservoir simulation grid by grouping fine-scale geologic model cells into effective simulation cells. Our algorithm groups the layers in such a way that the heterogeneity measure of an appropriately defined static property is minimized within the layers and maximized between the layers. The optimal number of layers is then selected based on an analysis resulting in a minimum loss of heterogeneity. We demonstrate the validity of the optimal gridding by applying our method to a history-matched waterflood in a structurally complex and faulted offshore turbiditic oil reservoir. The field is located in a prolific hydrocarbon basin offshore South America. More than 10 years of production data from up to 8 producing wells are available for history matching.
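The layer-grouping criterion above (minimize heterogeneity within coarse layers, maximize it between them) can be illustrated with a small within/between variance split. This is a generic sketch with invented property values and groupings, not the thesis's actual measure or algorithm:

```python
# Hypothetical sketch: score a candidate grouping of fine layers into
# coarse layers by within-layer variance (to minimize) and between-layer
# variance (to maximize). Values and groupings are invented.
from statistics import mean

def variance_split(values, groups):
    """Return (within, between) variance for a partition of layer values."""
    overall = mean(values)
    n = len(values)
    within = between = 0.0
    for grp in groups:
        gvals = [values[i] for i in grp]
        gmean = mean(gvals)
        within += sum((v - gmean) ** 2 for v in gvals)   # spread inside a coarse layer
        between += len(gvals) * (gmean - overall) ** 2   # contrast across coarse layers
    return within / n, between / n

# Six fine layers with a clear pay/non-pay contrast (illustrative numbers):
vals = [10.0, 11.0, 9.5, 50.0, 52.0, 48.0]
good = [[0, 1, 2], [3, 4, 5]]   # keeps pay and non-pay distinct
bad = [[0, 1, 3], [2, 4, 5]]    # mixes the two zones
w1, b1 = variance_split(vals, good)
w2, b2 = variance_split(vals, bad)
assert w1 < w2 and b1 > b2      # good grouping: low within, high between
```

Because within- and between-group variance sum to the total variance for any partition, minimizing one is equivalent to maximizing the other, which is why the two objectives in the abstract are consistent.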
We demonstrate that any coarsening beyond the degree indicated by our analysis overly homogenizes the properties on the simulation grid and alters the reservoir response. An application to a tight gas sandstone developed by Schlumberger DCS is also used in our verification of the algorithm. The specific details of the tight gas reservoir are confidential to Schlumberger's client. Using a reservoir section, we demonstrate the effectiveness of our algorithm by visually comparing the reservoir properties to a Schlumberger fine-scale model.

Item: A Process Integration Approach to the Strategic Design and Scheduling of Biorefineries (2011-02-22). Elms, Rene Davina.

This work focused upon the design and operation of biodiesel production facilities in support of the broader goal of developing a strategic approach to the development of biorefineries. Biodiesel production provided an appropriate starting point for these efforts. The work was segregated into two stages. Various feedstocks may be utilized to produce biodiesel, including virgin vegetable oils and waste cooking oil. With changing prices, supply, and demand of feedstocks, a need exists to consider various feedstock options. The objective of the first stage was to develop a systematic procedure for the scheduling and operation of flexible biodiesel plants accommodating a variety of feedstocks. This work employed a holistic approach and a combination of process simulation, synthesis, and integration techniques to provide: process simulation of a biodiesel plant for various feedstocks, integration of energy and mass resources, optimization of process design and scheduling, and techno-economic assessment and sensitivity analysis of proposed schemes. An optimization formulation was developed to determine scheduling and operation for various feedstocks, and a case study was solved to illustrate the merits of the devised procedure.
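The stage-one idea of scheduling a flexible plant as feedstock prices change can be caricatured in a few lines. The thesis uses a proper optimization formulation; this greedy toy with invented prices, availabilities, and demand is only meant to show the shape of the decision:

```python
# Toy sketch (invented numbers): each period, fill the plant's feed demand
# from the cheapest available feedstock first as market prices shift.
demand = 100.0  # gal of oil feed needed per period (assumed)

# per-period market: feedstock -> (price $/gal, available gal), invented
periods = [
    {"virgin_soy": (3.10, 80.0), "waste_cooking_oil": (2.60, 60.0)},
    {"virgin_soy": (2.40, 80.0), "waste_cooking_oil": (2.90, 60.0)},
]

schedule = []
for market in periods:
    need, cost, used = demand, 0.0, {}
    for name, (price, avail) in sorted(market.items(), key=lambda kv: kv[1][0]):
        if need <= 0:
            break
        take = min(need, avail)   # buy cheapest feedstock up to availability
        used[name] = take
        cost += take * price
        need -= take
    schedule.append((used, round(cost, 2)))

print(schedule[0][0])  # {'waste_cooking_oil': 60.0, 'virgin_soy': 40.0}
```

In period two the price flip makes virgin oil the preferred feedstock, which is the kind of switching behavior a multi-feedstock schedule has to capture.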
With increasing attention to the environmental impact of discharging greenhouse gases (GHGs), there has been growing public pressure to reduce the carbon footprint associated with fossil fuel use. In this context, one key strategy is the substitution of fossil fuels with biofuels such as biodiesel. The design of biodiesel plants has traditionally been conducted based on technical and economic criteria. GHG policies have the potential to significantly alter the design of these facilities, the selection of feedstocks, and the scheduling of multiple feedstocks. The objective of the second stage was to develop a systematic approach to the design and scheduling of biodiesel production processes while accounting for the effect of GHG policies. An optimization formulation was developed to maximize the profit of the process subject to flowsheet synthesis and performance modeling equations. The carbon footprint is accounted for through a life cycle analysis (LCA). The objective function includes a term reflecting the impact of the LCA of a feedstock and its processing to biodiesel. A multiperiod approach was used, and a case study was solved with several scenarios of feedstocks and GHG policies.

Item: A Quasi-Dynamic HVAC and Building Simulation Methodology (2012-07-16). Davis, Clinton Paul.

This thesis introduces a quasi-dynamic building simulation methodology which complements existing building simulators by allowing transient models of HVAC (heating, ventilating and air-conditioning) systems to be created in a way analogous to their design and simulated in a computationally efficient manner. The methodology represents a system as interconnected, object-oriented sub-models known as components. Fluids and their local properties are modeled using discrete, incompressible objects known as packets. System-wide pressures and flow rates are modeled similarly to electrical circuit models.
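The component/packet decomposition can be sketched minimally: packets are incompressible objects carrying local properties, and moving them between components emulates flow. The classes and the fixed temperature rise below are invented for illustration and are not PAQS's actual design:

```python
# Minimal invented sketch of the packetized idea: fluid is a queue of
# incompressible "packets" with local properties; transferring packets
# between components emulates flow through the system.
from collections import deque
from dataclasses import dataclass

@dataclass
class Packet:
    volume: float        # gal, fixed (incompressible)
    temperature: float   # degF, a local property carried by the packet

class Component:
    def __init__(self, name):
        self.name = name
        self.buffer = deque()
    def push(self, packet):
        self.buffer.append(packet)
    def pull(self):
        return self.buffer.popleft()

class HeatingCoil(Component):
    def pull(self):
        pkt = super().pull()
        pkt.temperature += 20.0   # crude fixed temperature rise (assumed)
        return pkt

duct = Component("supply_duct")
coil = HeatingCoil("heating_coil")
coil.push(Packet(volume=1.0, temperature=55.0))
duct.push(coil.pull())           # transferring the packet emulates flow
assert duct.pull().temperature == 75.0
```

In the real methodology the interconnections additionally form a system-wide fluid circuit, solved like an electrical circuit, which sets the pressures and flow rates that govern how fast packets move.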
Transferring packets between components emulates fluid flow, while the system-wide fluid circuit formed by the components' interconnections determines system-wide pressures and flow rates. A tool named PAQS, after the Packetized Quasi-dynamic Simulation methodology, was built to demonstrate the described methodology. Validation tests of PAQS found that its steady-state energy use predictions differed by less than 3% from a comparable steady-state model. PAQS was also able to correctly model the transient behavior of a dynamic linear analytical system.

Item: A simulation model of Rio Grande wild turkey dynamics in the Edwards Plateau of Texas (Texas A&M University, 2006-08-16). Schwertner, Thomas Wayne.

I investigated the effect of precipitation and predator abundance on the Rio Grande wild turkey (Meleagris gallopavo; RGWT) in Texas. My results suggested that RGWT production was strongly correlated with cumulative winter precipitation over the range of the RGWT in Texas. However, I found no evidence that predator abundance influenced RGWT production, although spatial asynchrony of predator populations at multiple spatial scales might have masked broad-scale effects. Using the results of these analyses, as well as empirical data derived from the literature and from field studies in the southern Edwards Plateau, I developed a stochastic, density-dependent, sex- and age-specific simulation model of wild turkey population dynamics. I used the model to evaluate the effect of alternative harvest management strategies on turkey populations. Sensitivity analysis of the model suggested that the shape of the density-dependence relationship, clutch size, hatchability, juvenile sex ratio, poult survival, juvenile survival, and nonbreeding hen mortality most strongly influenced model outcome. Of these, density dependence, sex ratio, and juvenile survival were least understood and merit further research.
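The harvest question posed by this model can be mimicked with a toy stochastic, density-dependent projection. Everything here (growth rate, carrying capacity, starting hens) is invented and far simpler than the sex- and age-specific model in the thesis; the point is only the qualitative effect of the fall harvest rate:

```python
# Hedged toy of the harvest experiment: a density-dependent hen population
# projected 25 years under an annual fall harvest rate. All parameters
# are invented; the real model is sex- and age-specific.
import random

def project(harvest_rate, years=25, seed=1):
    random.seed(seed)                      # same weather draws for every rate
    hens, K = 500.0, 1000.0                # start size and carrying capacity (assumed)
    for _ in range(years):
        r = random.gauss(0.25, 0.10)       # stochastic annual growth rate
        hens += r * hens * (1.0 - hens / K)    # density dependence
        hens *= (1.0 - harvest_rate)           # fall hen harvest
    return hens

light = project(0.02)
heavy = project(0.15)
assert heavy < light   # heavier hen harvest leaves fewer hens after 25 years
```

Fixing the random seed gives both scenarios the same environmental draws, so the comparison isolates the harvest rate, which is the experimental design the abstract describes for its alternative harvest simulations.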
My evaluation of fall hen harvest suggested that current rates do not pose a threat to turkey populations. Moreover, it appears that hen harvest can be extended to other portions of the RGWT range without reducing turkey abundance, assuming that population dynamics and harvest rates are similar to those in the current fall harvest zone. Finally, simulation of alternative hen harvest rates suggested that rates ≥5% of the fall hen population resulted in significant declines in the simulated population after 25 years, and rates ≥15% resulted in significant risk of extinction for the simulated population.

Item: A simulation study to verify Stone's simultaneous water and gas injection performance in a 5-spot pattern (Texas A&M University, 2008-10-10). Barnawi, Mazen Taher.

Water alternating gas (WAG) injection is a proven technique to enhance oil recovery. It has been successfully implemented in the field since 1957, with recovery increases in the range of 5-10% of oil-initially-in-place (OIIP). In 2004, Herbert L. Stone presented a simultaneous water and gas injection technique: gas is injected near the bottom of the reservoir, and water is injected directly on top at high rates to prevent upward channeling of the gas. Stone's mathematical model indicated the new technique can increase vertical sweep efficiency 3-4 fold over WAG. In this study, a commercial reservoir simulator was used to predict the performance of Stone's technique and compare it to WAG and other EOR injection strategies. Two sets of relative permeability data were considered. Multiple combinations of total injection rates (water plus gas) and water/gas ratios, as well as injection schedules, were investigated to find the optimum design parameters for an 80-acre 5-spot pattern unit. Results show that injecting water above gas may result in better oil recovery than WAG injection, though not to the extent indicated by Stone. The increase in oil recovery with SSWAG injection is a function of the critical gas saturation.
The more gas is trapped in the formation, the higher the oil recovery obtained. This is probably because areal sweep efficiency is a more dominant factor in a 5-spot pattern. Periodic shut-off of the water injector has little effect on oil recovery. Water/gas injection ratio optimization may result in a slight increase in oil recovery. SSWAG injection results in a steadier injection pressure and less fluctuation in gas production rate compared to WAG injection.

Item: Accounting for reservoir uncertainties in the design and optimization of chemical flooding processes (2012-08). Rodrigues, Neil; Delshad, Mojdeh; Pope, Gary A.

Chemical Enhanced Oil Recovery methods have been growing in popularity as a result of the depletion of conventional oil reservoirs and high oil prices. These processes are significantly more complex than waterflooding and require detailed engineering design before field-scale implementation. Coreflood experiments performed on reservoir rock are invaluable for obtaining parameters that can be used for field-scale flooding simulations. However, the design used in these floods may not always scale to the field due to heterogeneities, chemical retention, and mixing and dispersion effects. Reservoir simulators can be used to identify an optimum design that accounts for these effects, but uncertainties in reservoir properties can still cause poor project results if they are not properly accounted for. Different reservoirs are investigated in this study, including more unconventional applications of chemical flooding such as a 3-md, high-temperature carbonate reservoir and a heterogeneous sandstone reservoir with very high initial oil saturation. The goal of the research presented here is to investigate the impact that select reservoir uncertainties can have on the success of the pilot and to propose methods to reduce the sensitivity to these parameters.
This research highlights the importance of good mobility control in all the case studies, which is shown to have a significant impact on the economics of the project. It was also demonstrated that a slug design with good mobility control is less sensitive to uncertainties in the relative permeability parameters. The research also demonstrates that for a low-permeability reservoir, surfactant propagation can have a significant impact on the economics of a Surfactant-Polymer Flood. In addition to mobilizing residual oil and increasing oil recovery, the surfactant enhances the relative permeability, and this has a significant impact on increasing the injectivity and reducing the project life. Injecting a high concentration of surfactant also makes the design less sensitive to uncertainties in adsorption. Finally, it was demonstrated that for a heterogeneous reservoir with high initial oil saturation, optimizing the salinity gradient will significantly increase the oil recovery and will also make the process less sensitive to uncertainties in the cation exchange capacity.

Item: Adequate description of heavy oil viscosities and a method to assess optimal steam cyclic periods for thermal reservoir simulation (Texas A&M University, 2006-08-16). Mago, Alonso Luis.

A global steady increase of energy consumption coupled with the decline of conventional oil resources points to a more aggressive exploitation of heavy oil. Heavy oil is a major source of energy in this century, with a worldwide base reserve exceeding 2.5 trillion barrels. Management decisions and production strategies for thermal oil recovery processes are frequently based on reservoir simulation. A proper description of the physical properties, particularly oil viscosity, is essential in performing reliable modeling studies of fluid flow in the reservoir.
We simulated cyclic steam injection of the highly viscous Hamaca oil, with a viscosity of over 10,000 cp at ambient temperature, and predicted production was impacted by up to an order of magnitude when improper mixing rules were used to describe the oil viscosity. This thesis demonstrates the importance of these mixing rules and alerts reservoir engineers to the significance of the different options simulators have built into their platforms to describe the viscosity of heavy oils. Log-linear and power mixing rules do not provide enough flexibility to describe the viscosity of extra heavy oil with temperature. A recently implemented mixing rule in a commercial simulator has been studied, providing satisfactory results. However, the methodology requires substantial intervention and cannot be automatically updated. We provide guidelines to improve it and suggest more flexible mixing rules that could easily be implemented in commercial simulators. We also provide a methodology to determine the adequate length of each of the periods in cyclic steam injection: injection, soaking and production. There is considerable speculation on this matter, and one of the objectives of this thesis is to better understand and provide guidelines to optimize oil production using proper lengths for each of these periods. We have found that the production and injection periods should be similar in length; nevertheless, the production period should not be shorter than the injection period.
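The two mixing-rule families named above can be written down generically. These are textbook forms with invented blend values, not the specific rules or fluids in the thesis, but they show how strongly the choice of rule changes the predicted blend viscosity:

```python
# Generic toy forms of the two mixing-rule families (invented numbers):
#   log-linear: ln(mu_mix) = sum_i x_i * ln(mu_i)
#   power law:  mu_mix**n  = sum_i x_i * mu_i**n   (one common form)
import math

def log_linear_mix(fractions, viscosities):
    return math.exp(sum(x * math.log(mu) for x, mu in zip(fractions, viscosities)))

def power_mix(fractions, viscosities, n=0.25):
    return sum(x * mu ** n for x, mu in zip(fractions, viscosities)) ** (1.0 / n)

mu = [10_000.0, 1.0]   # cp: extra heavy oil and a light component (invented)
x = [0.8, 0.2]         # blend fractions
print(round(log_linear_mix(x, mu)))   # 1585
print(round(power_mix(x, mu)))        # 4521
```

The same 80/20 blend comes out almost a factor of three apart under the two rules, which is the kind of spread that, propagated through a thermal simulation, can shift predicted production by the order of magnitude the abstract reports.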
The soaking period, on the other hand, should be as short as possible, because it is unproductive time in terms of field oil production for the well and therefore translates into negative cash flow for a company.

Item: An Energy Analysis Of A Large, Multipurpose Educational Building In A Hot Climate (2012-02-14). Kamranzadeh, Vahideh.

In this project, steady-state building loads for Constant Volume Terminal Reheat (CVTR), Dual Duct Constant Volume (DDCV) and Dual Duct Variable Air Volume (DDVAV) systems have been modeled for the Zachry Engineering Building. First, the thermal resistance values of the building structure were calculated. After applying some assumptions, building characteristics were determined and building loads were calculated using the diversified loads calculation method. With six months of daily data for the Zachry building, the inputs to the CVTR, DDCV and DDVAV Microsoft Excel code were prepared to start the simulation. The air handling units for the Zachry building are Dual Duct Variable Air Volume (DDVAV) systems. The calibration procedure compares calibration signatures with characteristic signatures to determine which input variables need to be changed to achieve proper calibration. Calibration signatures are the difference between measured and simulated energy consumption as a function of temperature. Characteristic signatures are the change in energy consumption as a function of temperature obtained by changing the value of input variables of the system. The base simulated model of the DDVAV system was adjusted according to the characteristic signatures of the building to get the closest result to the measured data. The simulation method for calibration could be used for energy audits, improving energy efficiency, and fault detection.
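The quantities used to judge calibration here are standard: the calibration signature is the measured-minus-simulated difference, and goodness of fit is reported as RMSE and MBE. A sketch with invented daily-consumption values (the MBE sign convention, simulated minus measured, is one common choice):

```python
# Standard definitions of the calibration metrics; the consumption
# values below are invented for illustration.
import math

def mbe(measured, simulated):
    """Mean bias error: average of (simulated - measured)."""
    return sum(s - m for m, s in zip(measured, simulated)) / len(measured)

def rmse(measured, simulated):
    """Root mean square error of simulated vs. measured."""
    return math.sqrt(sum((s - m) ** 2 for m, s in zip(measured, simulated)) / len(measured))

measured = [40.0, 52.0, 61.0]    # MMBtu/day at three outdoor temperatures
simulated = [45.0, 50.0, 58.0]   # model output at the same temperatures
signature = [m - s for m, s in zip(measured, simulated)]  # calibration signature
print(signature)                              # [-5.0, 2.0, 3.0]
print(round(rmse(measured, simulated), 3))    # 3.559
```

Note how a model can have a near-zero MBE (errors cancel) while the RMSE stays large, which is why calibration tracks both metrics, as in the numbers reported below.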
In the base model of the DDVAV system, without any changes in the input, the chilled water consumption had a Root Mean Square Error (RMSE) of 56.705577 MMBtu/day and a Mean Bias Error (MBE) of 45.763256 MMBtu/day, while hot water consumption had an RMSE of 1.9072574 MMBtu/day and an MBE of 45.763256 MMBtu/day. In the calibration process, system parameters such as zone temperature, cooling coil temperature, minimum supply air and minimum outdoor air were changed. The decisions for varying the parameters were based on the characteristic signatures provided in the project. After applying changes to the system parameters, the RMSE and MBE for both hot and chilled water consumption were significantly reduced: chilled water consumption had an RMSE of 12.749868 MMBtu/day and an MBE of 3.423188 MMBtu/day, and hot water consumption had an RMSE of 1.6790 MMBtu/day and an MBE of 0.12513 MMBtu/day.

Item: Analysis and synthesis of bipedal humanoid movement: a physical simulation approach (2013-08). Cooper, Joseph L., 1980-; Ballard, Dana H. (Dana Harry), 1946-.

Advances in graphics and robotics have increased the importance of tools for synthesizing humanoid movements to control animated characters and physical robots. There is also an increasing need for analyzing human movements for clinical diagnosis and rehabilitation. Existing tools can be expensive, inefficient, or difficult to use. Using simulated physics and motion capture to develop an interactive virtual reality environment, we capture natural human movements in response to controlled stimuli. This research then applies insights into the mathematics underlying physics simulation to adapt the physics solver to support many important tasks involved in analyzing and synthesizing humanoid movement.
These tasks include fitting an articulated physical model to motion capture data, modifying the model pose to achieve a desired configuration (inverse kinematics), inferring internal torques consistent with changing pose data (inverse dynamics), and transferring a movement from one model to another (retargeting). The result is a powerful and intuitive process for analyzing and synthesizing movement in a single unified framework.

Item: Analysis of HMA permeability through microstructure characterization and simulation of fluid flow in X-ray CT images (Texas A&M University, 2005-02-17). Al Omari, Aslam Ali Mufleh.

The infiltration of water in asphalt pavements promotes moisture damage, primarily by degrading the binder's cohesive bond and the adhesive bond between aggregates and binder. Moisture damage is associated with excessive deflection, cracking, and rutting. The first step in addressing the problems caused by the presence of water within pavement systems is quantifying the permeability of hot mix asphalt (HMA) mixes. This dissertation deals with the development of empirical-analytical and numerical approaches for predicting the permeability of HMA. Both approaches rely on the analysis of the air void distribution within the HMA microstructure. The empirical-analytical approach relies on the development of modified forms of the Kozeny-Carman equation and on determining the material properties involved in this equation through three-dimensional microstructure analyses of X-ray Computed Tomography (CT) images. These properties include connected percent air voids (effective porosity), tortuosity, and air void specific surface area. A database of materials and permeability measurements was used to verify the developed predictive equation. The numerical approach, which is the main focus of this study, includes the development of a finite difference numerical simulation model to simulate steady incompressible fluid flow in HMA.
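For orientation, one textbook form of the Kozeny-Carman relation ties permeability to exactly the three CT-derived properties listed above. The dissertation develops modified forms; this generic version with invented inputs is only illustrative:

```python
# One textbook Kozeny-Carman form (illustrative, not the dissertation's
# modified equation):
#   k = phi_e**3 / (C * tau**2 * S**2 * (1 - phi_e)**2)
# phi_e: effective porosity, tau: tortuosity, S: specific surface area,
# C: empirical shape constant. Inputs below are invented.
def kozeny_carman(phi_e, tau, S, C=5.0):
    return phi_e ** 3 / (C * tau ** 2 * S ** 2 * (1.0 - phi_e) ** 2)

k_low = kozeny_carman(phi_e=0.08, tau=1.5, S=30.0)
k_high = kozeny_carman(phi_e=0.12, tau=1.5, S=30.0)
assert k_high > k_low                            # more connected voids -> more permeable
assert kozeny_carman(0.08, 2.0, 30.0) < k_low    # more tortuous paths -> less permeable
```

The cubic dependence on effective porosity and inverse-square dependence on tortuosity and specific surface explain why those three microstructure properties are the ones extracted from the CT images.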
The model uses a non-staggered grid system that utilizes only one cell to solve all governing equations, and it is applicable for cell Reynolds number (Rec) values that are not restricted by |Rec| ≤ 2. The validity of the numerical model is verified through comparisons with closed-form solutions for idealized microstructures. The numerical model was used to find the components of the three-dimensional (3-D) permeability tensor and permeability anisotropy values for different types of HMA mixes. It was found that the principal permeability directions are almost horizontal and vertical, with the maximum permeability in the horizontal direction.

Item: Analysis, design and implementation of models for housestaff scheduling at outpatient clinics and improving patient flow at a family health clinic (2015-05). Shu, Zhichao, Ph.D.; Bard, Jonathan F.; Morrice, Douglas J. (Douglas John), 1962-; Khajavirad, Aida; Dimitrov, Ned; Leykum, Luci.

Clinical experiences during the three years of residency occur in inpatient and outpatient settings on generalist and specialist clinical services. Housestaff rotate through different clinical experiences monthly, with their primary care clinic time overlaid longitudinally on these other clinical services. The primary goals of this research are to construct housestaff schedules and improve efficiencies for residency programs. In the first phase of the research, we developed two models for constructing monthly clinic schedules for housestaff training in Internal Medicine. In our first model, the objective is to both maximize clinic utilization and minimize the number of violations of a prioritized set of goals while ensuring that certain clinic-level and individual constraints are satisfied. The corresponding problem is formulated as an integer goal program in which several of the hard constraints are temporarily allowed to be violated to avoid infeasibility.
A three-phase methodology is then proposed to find solutions. The second model solves a similar problem with the objective of maximizing the number of interns and residents assigned clinic duty each month during their training in Internal Medicine. A complexity analysis demonstrates that the basic problem can be modeled as a pure network and the full problem as a network with gains. In the second phase of the research, the goal was to redesign the monthly templates that comprise the annual block rotations to obtain better housestaff schedules. To implement this model, we investigate two different programs: Family Medicine and Internal Medicine. The problems were formulated as mixed-integer programs but proved too difficult to solve exactly. As an alternative, several heuristics were developed that yielded good feasible solutions. For the last part of the research, we focused on improving patient flow at a family health clinic. The objective was to obtain a better understanding of patient flow through the clinic and to investigate changes to current scheduling rules and operating procedures. Discrete event simulation was used to establish a baseline and to evaluate a variety of scenarios associated with appointment scheduling and managing early and late arrivals.

Item: Analysis-ready models of tortuous, tightly packed geometries (2013-08). Edwards, John Martin; Bajaj, Chandrajit.

Complex networks of cells called neurons in the brain enable human learning and memory. The topology and electrophysiological function of these networks are affected by the nano- and microscale geometries of neurons. Understanding these structure-function relationships in neurons is an important component of neuroscience, in which simulation plays a fundamental role. This thesis addresses four specific geometric problems raised by modeling and simulation of intricate neuronal structure and behavior at the nanoscale.
The first two problems deal with 3D surface reconstruction: neurons are geometrically complex structures that are tightly intertwined in the brain, presenting great challenges in reconstruction. We present the first algorithm that reconstructs surface meshes from polygonal contours while provably guaranteeing watertight, manifold, and intersection-free forests of densely packed structures. Many algorithms exist that produce surfaces from cross-sectional contours, but all either use heuristics in fitting the surface or fail when presented with tortuous objects in close proximity. Our algorithm reconstructs surfaces that are not only internally correct, but also free of intersections with other reconstructed objects in the same region. We also present a novel surface remeshing algorithm suitable for models of neuronal dual space. The last two problems treated by this thesis deal with producing derivative models from surface meshes. A range of neuronal simulation methodologies exist, and we offer a framework to derive appropriate models for each from surface meshes. We present two specific algorithms that yield analysis-ready 1D cable models in one case and proposed "aligned cell" models in the other. In the creation of aligned cells we also present a novel adaptive distance transform. Finally, we present a software package called VolRoverN in which we have implemented many of our algorithms and which we expect will serve as a repository of important tools for the neuronal modeling community.
Our algorithms are designed to meet the immediate needs of the neuroscience community, but as we show in this thesis, they are general and suitable for a variety of applications.

Item: Anisotropic hybrid turbulence modeling with specific application to the simulation of pulse-actuated dynamic stall control (2015-12). Haering, Sigfried William; Moser, Robert deLancey; Murthy, Jayathi; Bogard, David G.; Ezekoye, Ofodike A.; Oliver, Todd.

Experimental studies have shown that pulse-actuated dynamic stall control may provide a simple means to significantly increase the performance of lifting surfaces and expand their flight envelope. However, precise information on the complex boundary-layer reattachment mechanisms is inaccessible to experimental measurement. Therefore, simulations are necessary to fully understand, optimize, and apply this method. Due to the inherent shortcomings of RANS, the computational expense of LES, and deficiencies in current hybrid modeling approaches, a new hybrid modeling framework has been developed. Based on using the two-point second-order structure function to drive a local equilibrium between resolved and modeled turbulence, the new approach addresses issues associated with inhomogeneous and anisotropic grids as well as the treatment of the RANS/LES interface in hybrid simulations. Numerical studies using hybrid RANS/LES modeling approaches of a stalled airfoil with spanwise-uniform actuation regions experiencing single-pulse-actuated flow reattachment have been performed. The mechanism responsible for reattachment has been identified as a repeating wall-vortex interaction process.
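The two-point second-order structure function that drives the framework has the standard definition D(r) = ⟨(u(x + r) − u(x))²⟩. As a purely illustrative sketch (a synthetic 1-D signal, not turbulence data), it can be estimated like this:

```python
# Standard estimator of the second-order structure function
# D(r) = <(u(x + r) - u(x))**2>, applied to a synthetic 1-D signal.
import math

def structure_function(u, shift):
    pairs = [(u[i + shift] - u[i]) ** 2 for i in range(len(u) - shift)]
    return sum(pairs) / len(pairs)

# Invented two-scale signal standing in for a velocity record:
u = [math.sin(0.05 * i) + 0.3 * math.sin(0.7 * i) for i in range(1000)]
d_small = structure_function(u, 1)
d_large = structure_function(u, 40)
assert d_small < d_large   # wider separation captures more of the signal's variance
```

Because D(r) grows as the separation r spans more of the fluctuating scales, comparing it against the resolved-scale content gives a grid-aware measure of how much turbulence the mesh can carry, which is the kind of quantity a hybrid RANS/LES blend needs on anisotropic grids.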
The new hybrid framework and anisotropic SGS models developed here are anticipated to be of great benefit well beyond the focus of this work, with application to many challenging flow situations of pressing engineering interest.

Item: Assessing personality using a virtual simulation: a research proposal (2011-05). Quick, Daniel Ryan; Sherry, Alissa René; Schallert, Diane.

One of the primary goals of personality assessment is to provide meaningful information regarding an individual's characteristic way of thinking, feeling, and behaving. Given the interaction between the individual and the context, however, there is much debate as to how well personality tests do what they intend. In this paper, the limitations of text-based personality assessments are examined, and the use of virtual simulations as an alternative to conventional tests is explored. A research study is proposed comparing a virtual test with a written test on a variety of criteria. Modern technology and the growing popularity of gaming suggest that researchers may find virtual simulations to be a more immersive, flexible, and accurate form of assessment.

Item: Assessing the Value of Delay to Truckers and Carriers (2011-02-22). Miao, Qing.

This thesis evaluates the Value of Delay (VOD) to commercial vehicle operators due to highway congestion. The VOD for congestion is a fundamental parameter driving the private sector's response to public freight projects and policies such as corridor construction and tolling. Factors affecting the commercial VOD include direct operational cost, travel length, travel time variation, inventory holding, and warehouse management. To estimate the VOD, two methods are adopted in this thesis: one is a Stated Preference (SP) survey; the other is carrier fleet operational simulation. The simulation framework uses ArcGIS and C. ArcGIS is used to generate a freight network based on the Houston, TX highway system.
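The arithmetic behind the two estimation routes can be sketched with standard discrete-choice bookkeeping. The coefficients and costs below are invented, not the thesis's estimates; the survey-side result uses the usual ratio of utility coefficients:

```python
# Hedged sketch of the two VOD estimates (all numbers invented).
# Survey side: with a generic utility U = b_cost*toll + b_time*delay,
# the implied value of delay is b_time / b_cost in $/hr.
b_cost = -0.12    # utility per dollar of toll
b_time = -6.0     # utility per hour of delay
vod_survey = b_time / b_cost
print(round(vod_survey, 1))   # 50.0  ($/hr)

# Simulation side: ratio of additional operational cost to delay incurred.
extra_cost = 210.0    # $ extra fleet cost caused by congestion (invented)
delay_hours = 2.0     # hours of delay behind that extra cost
vod_sim = extra_cost / delay_hours
print(vod_sim)                # 105.0 ($/hr)
```

With these made-up inputs the survey-style estimate comes out well below the simulation-style one, mirroring the direction of the gap the thesis actually reports between perceived and simulated VOD.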
A set of customers is randomly generated, each having a random demand for service associated with time windows for delivery and pickup. A heuristic algorithm is proposed to dispatch vehicles for truckload service on a continuous time horizon. The average VOD is then obtained as the ratio between the additional operational cost and the delay caused by the congestion. This ratio is assessed in two scenarios: a single depot and two cooperating depots. Different tests based on demand size, demand distribution pattern, time windows and location of congestion are conducted. Simulation shows a range of VOD from $93.99/hr to $120.89/hr for the case of a central depot and $79.81/hr to $83.81/hr for the case of two depots. In addition, an SP survey is conducted for truckers and carriers in two scenarios. The first scenario assumes a driver running late by 30 minutes on a congested road, while the second assumes an on-time delivery or pickup. Several tolling alternatives are assumed to test the driver's willingness to pay for using a hypothetical toll road. The data is then regressed with a logit model using maximum likelihood estimation to obtain the perceived value of delay. A generic utility function is adopted, which results in a VOD range from $24.72/hr to $64.99/hr. A comparison between the survey and the simulation results shows that drivers perceive a significantly lower VOD than the simulated VOD in freight operation.

Item: Beauty waves: an artistic representation of ocean waves using Bezier curves (Texas A&M University, 2007-04-25). Faulkner, Jay Allen.

In this thesis, we present a method for computing an artistic representation of ocean waves using Bezier curves. Wave forms are loosely based on procedural wave models and are designed to emulate those found in both art and nature. The wave forms are generated using a slice method defined by structured user input, thus providing the artist with full control over crest shape and placement.
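The basic building block of such a slice is a cubic Bezier curve, evaluated here with de Casteljau's algorithm. The control points are invented; in the thesis they would come from the artist's structured input:

```python
# A cubic Bezier crest profile via de Casteljau's algorithm (control
# points invented for illustration).
def lerp(a, b, t):
    return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

def bezier(p0, p1, p2, p3, t):
    # repeated linear interpolation between control points
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

# One wave-crest slice: a flat base rising into a steep crest.
crest = [(0.0, 0.0), (0.4, 0.1), (0.5, 0.9), (1.0, 1.0)]
samples = [bezier(*crest, i / 10) for i in range(11)]
assert samples[0] == (0.0, 0.0)   # the curve interpolates its endpoints
```

Because the curve always interpolates its end control points while the interior points shape the profile, an artist can pin down where a crest sits and then sculpt its steepness or curl by moving only the interior points.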
Wave propagation is obtained by interpolating between defined crest shapes and positions. We also present a method for computing a stylized representation of breaking crests in shallow water. Artists may use our model to create many interesting wave forms, including basic sinusoidal waves and waves with breaking crests whose rotation is cyclical in time. The major drawbacks of our solution are that data entry can be tedious and it can be difficult to produce waves that animate with a natural appearance.

Item: Border Crossing Modeling and Analysis: A Non-Stationary Dynamic Reallocation Methodology For Terminating Queueing Systems (2012-10-19). Moya, Hiram.

The United States international land boundary is a volatile, security-intense area. In 2010, combined trade among the North American nations was $918 billion, with 80% transported by commercial trucks. Over 50 million commercial vehicles cross the Texas/Mexico border every year, not including private vehicles and pedestrian traffic, between Brownsville and El Paso, Texas, through one of over 25 major border crossings called "ports of entry" (POEs). Recently, securing our southwest border against terrorist interventions, undocumented immigrants, and the illegal flow of drugs and guns has come to dominate the need to efficiently and effectively process people, goods and traffic. Increasing security and inspection requirements are seriously affecting transit times. Each POE is configured as a multi-commodity, prioritized queueing network which rarely, if ever, operates in steady state. The problem is therefore one of finding a balance between a reduction of wait time and its variance, POE operating costs, and the sustainment of a security level. The contribution of the dissertation is three-fold. The first part uses queueing theory on the border crossing process to develop a methodology that decreases border wait times without increasing costs or affecting security procedures.
The outcome is the Dynamic Reallocation Methodology (DRM). Currently at a POE, inspection stations are fixed and can inspect only one truck type, FAST or non-FAST program participants. The methodology proposes movable servers that, once a threshold is met, can be switched to serve the other truck type. Particular emphasis is given to inspection (service) times under time-varying arrivals (demands). The second contribution is an analytical model of the POE used to analyze the effects of the DRM. DRM benefits are first evaluated assuming Markovian service times. However, field data and other research suggest a general service-time distribution, so a k-phase Coxian approximation is implemented. The DRM is analyzed against this new baseline using the expected number in the system and cycle times. A variance reduction procedure is also proposed and evaluated under the DRM. Results show that queue length and wait time are reduced by 10 to 33% depending on load, while FAST wait times increase by less than three minutes.

Item Coarse scale simulation of tight gas reservoirs(Texas A&M University, 2004-09-30) El-Ahmady, Mohamed Hamed
It is common for field models of tight gas reservoirs to include several wells with hydraulic fractures. These hydraulic fractures can be very long, extending more than a thousand feet, while a hydraulic fracture's width is usually no more than about 0.02 ft. Together, these factors indicate a need to model hydraulic fractures within coarse grid blocks, since simulating such field models on fine grids may be impractical. In this dissertation, a method was developed to simulate a reservoir model with a single hydraulic fracture that passes through several coarse gridblocks. The method was tested, and the early-time numerical error introduced by the coarse grid blocks was quantified.
In addition, rules were developed and tested for using uniform fine grids to simulate a reservoir model with a single hydraulic fracture, and the results were compared with simulations using non-uniform fine grids.

Item A collaborative approach to IR evaluation(2014-05) Sheshadri, Aashish; Grauman, Kristen Lorraine, 1979-; Lease, Matthew A.
In this thesis we investigate two main problems: 1) inferring consensus from disparate inputs to improve the quality of crowd-contributed data; and 2) developing a reliable crowd-aided IR evaluation framework. Regarding the first contribution, although many statistical label aggregation methods have been proposed, little comparative benchmarking has occurred in the community, making it difficult to determine the state of the art in consensus or to quantify novelty and progress, and leaving modern systems to adopt simple control strategies. To make state-of-the-art consensus methods accessible and to aid their progress, we develop SQUARE, an open-source shared-task benchmarking framework that includes benchmark datasets, defined tasks, standard metrics, and reference implementations with empirical results for several popular methods. In developing SQUARE, we propose a crowd simulation model that emulates real crowd environments, enabling rapid and reliable experimentation with collaborative methods under different crowd contributions. We apply the benchmark's findings to develop reliable crowd-contributed test collections for IR evaluation. As our second contribution, we describe a collaborative model for distributing relevance-judging tasks between trusted assessors and crowd judges. Building on prior work's hypothesis that judging disagreements concentrate on borderline documents, we train a logistic regression model to predict assessor disagreement and prioritize judging tasks by expected disagreement. Judgments are generated from different crowd models and intelligently aggregated.
Given a priority queue, a judging budget, and a ratio of expert to crowd judging costs, critical judging tasks are assigned to trusted assessors, with the crowd supplying the remaining judgments. Results on two TREC datasets show that a significant judging burden can be confidently shifted to the crowd, achieving high rank correlation, often at lower cost than exclusive use of trusted assessors.

Item Comparing item selection methods in computerized adaptive testing using the rating scale model(2016-08) Butterfield, Meredith Sibley; Dodd, Barbara Glenzing; Whittaker, Tiffany A; Casabianca-Marshall, Jodi M; Hersh, Matthew A
Computerized adaptive testing (CAT), a form of computer-based testing that selects and administers items matched to the examinee's trait level, can be shorter than traditional fixed-length paper-and-pencil testing while maintaining comparable or greater measurement precision. Administration of computer-based patient-reported outcome (PRO) measures has increased recently in the medical field. Because PRO measures often have small item pools, administer few items, and serve populations in poor health, the benefits of CAT are especially advantageous. In CAT, maximum Fisher information (MFI) is the most commonly used item selection procedure because it is easy to use and computationally simple. Its main drawback, however, is the attenuation paradox: if the estimated trait level is not the examinee's true trait level, the selected items will not maximize information at the true trait level, and measurement is less precise. Alternative item selection methods have been proposed to address this issue, but in past studies they have not outperformed MFI. Recently, the gradual maximum information ratio (GMIR) item selection method was proposed, and previous findings suggest GMIR could be beneficial for a short CAT.
This simulation study compared the GMIR and MFI item selection methods under conditions specific to PRO measures. GMIR and MFI were compared under Andrich's rating scale model (ARSM) across two polytomous item pool sizes (41 and 82 items), two latent trait distributions (normal and negatively skewed), and three combined stopping rules pairing a maximum number of items with a minimum standard error (5/0.54, 7/0.46, 9/0.40). The conditions were fully crossed. Performance was evaluated in terms of descriptive statistics of the final trait estimates, measurement precision, conditional measurement precision, and administration efficiency. GMIR showed better measurement precision when the test length was 5 items, with higher mean correlations between known and estimated trait levels, smaller mean bias, and smaller mean RMSE. No effect of item pool size or latent trait distribution was found. Across item selection methods, measurement precision increased with test length, but with diminishing returns from 7 to 9 items.
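As an illustrative sketch of the selection machinery the abstract above describes (not the author's implementation): under Andrich's rating scale model, the Fisher information of an item equals the conditional variance of its score, and the MFI rule simply administers the unused item with the greatest information at the current trait estimate. The item difficulties, thresholds, and pool below are hypothetical.

```python
import math

def rsm_probs(theta, b, taus):
    """Category probabilities under Andrich's rating scale model (ARSM).

    theta: trait level; b: item difficulty;
    taus:  category thresholds tau_1..tau_m (tau_0 is fixed at 0).
    """
    logits, s = [0.0], 0.0
    for tau in taus:
        s += theta - b - tau
        logits.append(s)
    z = [math.exp(v) for v in logits]
    total = sum(z)
    return [v / total for v in z]

def rsm_information(theta, b, taus):
    """Item information = conditional variance of the item score,
    a property of Rasch-family models such as the ARSM."""
    p = rsm_probs(theta, b, taus)
    ex = sum(k * pk for k, pk in enumerate(p))
    ex2 = sum(k * k * pk for k, pk in enumerate(p))
    return ex2 - ex * ex

def mfi_select(theta, pool, administered):
    """MFI rule: pick the unadministered item most informative at theta.

    pool: list of (difficulty, thresholds) pairs; administered: set of indices.
    """
    candidates = [i for i in range(len(pool)) if i not in administered]
    return max(candidates, key=lambda i: rsm_information(theta, *pool[i]))

# hypothetical 3-category items: (difficulty, [tau_1, tau_2])
pool = [(0.0, [-1.0, 1.0]), (2.0, [-1.0, 1.0]), (-1.5, [-1.0, 1.0])]
first = mfi_select(0.0, pool, set())
```

With symmetric thresholds, information peaks where the item difficulty matches the trait estimate, so MFI chooses the item nearest the current estimate; the attenuation paradox arises because that estimate may be wrong early in a short test. The GMIR modification studied in the thesis is not reproduced here.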