Browsing by Subject "Network"
Now showing 1 - 13 of 13
Item: Automatic semiconductor wafer map defect signature detection using a neural network classifier (2010-12)
Radhamohan, Ranjan Subbaraya; Ghosh, Joydeep; El-Hamdi, Mohamed
The application of popular image processing and classification algorithms, including agglomerative clustering and neural networks, is explored for the purpose of grouping semiconductor wafer defect map patterns. Challenges such as overlapping pattern separation, wafer rotation, and false data removal are examined and solutions proposed. After grouping, wafer processing history is used to automatically determine the most likely source of the issue. Results are provided that indicate these methods hold promise for wafer analysis applications.

Item: Causal Network Methods for Integrated Project Portfolio Risk Analysis (2014-08-06)
Govan, Paul
Corporate portfolio risk analysis is of primary concern for many organizations, as the success of strategic objectives greatly depends on an accurate risk assessment. Current risk analysis methods typically involve statistical models of risk with varying levels of complexity. However, because risk events are rare, sufficient data are often unavailable for statistical models. Other methods are the so-called expert models, which involve subjective estimates of risk based on experience and intuition. However, experience and intuition are often insufficient for expert models as well. Furthermore, neither approach reflects all of the information available on projects: both expert opinions and the observed data. The goal of this dissertation is to develop a general corporate portfolio risk analysis methodology that identifies theoretical causal relationships and integrates expert opinions with the observed data. The proposed conceptual framework takes a resource-based view, where risk is identified and measured in terms of the uncertainty associated with project resources.
The methodological framework utilizes causal networks to model risk and the associated consequences. This research contributes to the field of risk analysis in two primary ways. First, it introduces a new general theory of corporate portfolio risk analysis. This theoretical framework supports risk-based decision making, whether through formal analysis or heuristic measures. Second, it applies the causal network methodology to the problem of project risk analysis. This methodological framework provides the ability to model risk events throughout the project life cycle. Furthermore, it identifies risk-based dependencies given varying levels of information, and promotes organizational learning by identifying which project information is more or less valuable to the organization.

Item: Collaboration within the sport-based youth development non-profit network in Austin, TX (2016-05)
Klaic, Darija; Dixon, Marlene A., 1970-; Sparvero, Emily Suzanne, 1975-
This qualitative study assessed collaboration within the sport-based youth development non-profit network in Austin, TX. Network, capital, resource-sharing, and collaboration theories were used as lenses for this research project. The qualitative methods applied were surveys and follow-up interviews. Surveys were sent to 13 identified non-profit organizations in Austin, TX that use sports programming for youth development, in order to gain insight into their structure and organization, including collaboration and partnerships. Follow-up interviews were recorded, transcribed, coded, and analyzed. Findings revealed that there is no collaboration among the organizations participating in the study, but that their respective cross-sectoral collaboration networks are of vital importance to the organizations' existence and programming.
Recommendations were made on future collaborations within the network, and possible benefits of forming a coalition were discussed.

Item: Large-scale network analytics (2011-08)
Song, Han Hee, 1978-; Zhang, Yin, doctor of computer science
Scalable and accurate analysis of networks is essential to a wide variety of existing and emerging network systems. Specifically, network measurement and analysis helps to understand networks, improve existing services, and enable new data-mining applications. To support various services and applications in large-scale networks, network analytics must address the following challenges: (i) how to conduct scalable analysis in networks with a large number of nodes and links, (ii) how to flexibly accommodate various objectives from different administrative tasks, and (iii) how to cope with dynamic changes in the networks. This dissertation presents novel path analysis schemes that effectively address the above challenges in analyzing pair-wise relationships among networked entities. In doing so, we make three major contributions to large-scale IP networks, social networks, and application service networks. For IP networks, we propose an accurate and flexible framework for path property monitoring. Analyzing the performance of paths between pairs of nodes, our framework incorporates approaches that perform exact reconstruction of path properties as well as approximate reconstruction. Our framework scales to measurement experiments that span thousands of routers and end hosts, and is flexible enough to accommodate a variety of design requirements. For social networks, we present scalable and accurate graph embedding schemes. Aimed at analyzing the pair-wise relationships of social network users, we present three dimensionality reduction schemes leveraging matrix factorization, count-min sketch, and graph clustering paired with spectral graph embedding.
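As an illustrative aside on the dimensionality-reduction idea just mentioned: a low-rank embedding of a graph's adjacency (proximity) matrix can be sketched with a truncated SVD, so that inner products of the per-node coordinates approximate pairwise proximity. This is a generic textbook sketch, not the dissertation's algorithm; the toy graph and rank are invented.

```python
import numpy as np

def embed_graph(adj, rank):
    """Low-rank node embedding via truncated SVD.

    Returns coordinate matrices (X, Y) such that X @ Y.T is the best
    rank-`rank` approximation of the proximity matrix `adj`.
    """
    U, s, Vt = np.linalg.svd(adj, full_matrices=False)
    U_k, s_k, V_k = U[:, :rank], s[:rank], Vt[:rank, :].T
    return U_k * np.sqrt(s_k), V_k * np.sqrt(s_k)

# Toy 4-node undirected graph (illustrative only).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

X, Y = embed_graph(A, rank=2)
approx = X @ Y.T                      # rank-2 proximity estimate
err = np.linalg.norm(A - approx)      # reconstruction error
```

Each node is represented by only `rank` numbers, which is what makes such schemes attractive for networks with millions of nodes (the dissertation's sketch- and clustering-based variants push the scalability much further).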
As concrete applications showing the practical value of our schemes, we apply them to the important social analysis tasks of proximity estimation, missing link inference, and link prediction. The results clearly demonstrate the accuracy, scalability, and flexibility of our schemes for analyzing social networks with millions of nodes and tens of millions of links. For application service networks, we provide a proactive service quality assessment scheme. Analyzing the relationship between the satisfaction level of subscribers of an IPTV service and network performance indicators, our proposed scheme proactively assesses user-perceived service quality (i.e., it detects issues before IPTV subscribers complain) using performance metrics collected from the network. From our evaluation using network data collected from a commercial IPTV service provider, we show that our scheme is able to predict 60% of the service problems that customers complain about, with a false-positive rate of only 0.1%.

Item: Learning to write in (networked) public: children and the delivery of writing online (2014-12)
Roach, Audra Katherine; Bomer, Randy; Hoffman, Jim; Maloch, Beth; Schallert, Diane; Hodgson, Justin
This investigation explored how three children (together with parents) developed networked publics that were diverse, well-connected, and powerful in the world. It was framed in response to calls in the field to better understand the new literacies young writers develop online and outside of school, and to increase literacy educators' attention to the role of public audiences in writing and to how writing is circulated. Performative case study methodology, ethnographic methods, and digital methods were employed to track and describe the online networks of three children (ages 11-13).
These focal children were actively involved with their parents in social media, and had developed widespread networks with shared interests in children's books and book reviews (Case 1), baseball (Case 2), and helping the homeless (Case 3). The children's online networks were conceptualized as networked publics, drawing on Warner's (2002) notion of publics as ongoing discursive relations among strangers, and on Actor-Network Theory's notion of networks as assemblages of diverse interests that mobilize toward a common goal (Callon, 1986) and that develop stability in relation to ongoing circulations of texts (Latour, 1986; Spinuzzi, 2008). Research questions were framed broadly around the rhetorical canon of delivery [now digital delivery (Porter, 2009)], and were concerned with how writers distributed texts online, how those texts circulated, how the networked publics became more stable and powerful, and what instabilities children and parents had to negotiate in order to accomplish all of this. Data sources included interviews with 15 children and 28 adults, and fieldnote observations of approximately 1,700 screen-captured webpages and other online artifacts. Findings showed that the young writers and their parents initiated and sustained networked publics through distribution practices oriented toward building trust; their texts displayed interest, appreciation, reliability, service, credibility, and responsiveness. Both grassroots and commercial entities circulated texts in these networks, as they were sources of the ongoing renewal these different groups all needed in order to thrive. Sources of instability included conflicts over standards of writing quality, matters of profit, and the constant demand to generate new interest and writing online. Children and their parents responded to these instabilities by welcoming and negotiating heterogeneous perspectives and partnerships.
Implications of the study call for further research and teaching about the art of networked public discourse and digital delivery.

Item: Low Latency Stochastic Filtering Software Firewall Architecture (2012-08-29)
Ghoshal, Pritha
Firewalls are an integral part of network security. They are pervasive throughout networks and can be found in mobile phones, workstations, servers, switches, routers, and standalone network devices. Their primary responsibility is to track and discard unauthorized network traffic, and they may be implemented in anything from costly special-purpose hardware to flexible, inexpensive software running on commodity hardware. The most basic action of a firewall is to match packets against a set of rules in an Access Control List (ACL) to determine whether they should be allowed or denied access to a network or resource. By design, traditional firewalls must sequentially search through the ACL table, leading to increasing latencies as the number of entries in the table increases. This is particularly true for software firewalls implemented on commodity server hardware. Reducing latency in software firewalls may enable them to replace hardware firewalls in certain applications. In this thesis, we propose a software firewall architecture that removes the sequential ACL lookup from the critical path and thus decreases the latency per packet in the common case. To accomplish this, we implement a Bloom filter-based, stochastic pre-classification stage, enabling the bifurcation of the predicted-good and predicted-bad packet code paths and greatly improving performance. Our proposed architecture improves firewall performance by 67% to 92% under anonymized trace-based workloads from CAIDA servers.
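The bifurcation idea described above can be sketched in a few lines: deny rules are hashed into a Bloom filter, packets that miss the filter are guaranteed not to match any deny rule and take the fast path, and only possible matches fall back to the slow sequential ACL scan. This is a minimal illustrative sketch, not the thesis's implementation; the rule format and filter sizes are invented.

```python
import hashlib

class BloomFilter:
    """Tiny Bloom filter over strings (illustrative sizes)."""
    def __init__(self, bits=1024, hashes=3):
        self.bits, self.hashes = bits, hashes
        self.array = 0  # a big integer used as a bit set

    def _positions(self, item):
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.bits

    def add(self, item):
        for pos in self._positions(item):
            self.array |= 1 << pos

    def might_contain(self, item):
        # False means "definitely absent"; True may be a false positive.
        return all((self.array >> pos) & 1 for pos in self._positions(item))

# Pre-classification: hash every deny rule into the filter once.
deny_acl = ["10.0.0.5:any:tcp", "192.168.1.9:22:tcp"]  # invented rules
bloom = BloomFilter()
for rule in deny_acl:
    bloom.add(rule)

def classify(packet_key):
    if not bloom.might_contain(packet_key):
        return "fast-path: allow"          # no ACL match is possible
    # Possible match (or false positive): do the full sequential scan.
    return "deny" if packet_key in deny_acl else "slow-path: allow"
```

In the common case (most traffic is good), packets skip the O(n) ACL scan entirely; a Bloom-filter false positive only costs the slow path, never an incorrect deny.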
While our approach may incorrectly classify a small subset of bad packets as good, we show that these holes are neither predictable nor permanent, leading to a vanishingly small probability of firewall penetration.

Item: Microscale modeling of layered fibrous networks with applications to biomaterials for tissue engineering (2015-08)
Carleton, James Brian; Rodin, G. J. (Gregory J.); Sacks, Michael S.; Gonzalez, Oscar; van de Geijn, Robert; Mear, Mark
Many important biomaterials are composed of multiple layers of networked fibers. A prime example is in the field of tissue engineering, in which damaged or diseased native tissues are replaced by artificial tissues that are grown on fibrous polymer networks. For load-bearing tissues, it is critical that the mechanical behavior of the engineered tissue be similar to the behavior of the native tissue it will replace. In the case of soft tissues such as heart valves, the macroscale mechanical behavior is highly anisotropic and nonlinear. This behavior is a result of complex deformations of the collagen and elastin fibers that form the extracellular matrix (ECM). The microstructure of engineered tissues must be properly designed to reproduce this unique macroscopic behavior. While there is growing interest in modeling and simulation of the mechanical response of this class of biomaterials, a theoretical foundation for such simulations has yet to be firmly established. This work introduces a method for modeling materials that have a layered, fibrous network microstructure. Methods for characterizing the complex network geometry are first established. Then an algorithm is developed for generating realistic network geometry that is a good representation of electrospun tissue scaffolds, which serve as the primary synthetic structure on which engineered tissues are grown. The level of fidelity to the real geometry is a significant improvement over previous representations.
This improvement is important, since the scaffold geometry has a strong influence on the macroscopic mechanical behavior of the tissue, cell proliferation and attachment, nutrient and waste flows, and ECM generation. Because of the importance of scaffolds in tissue formation and function, this work focuses on characterizing scaffold network geometry and elucidating the impact of geometry on macroscale mechanics. Simulation plays an important role in developing a detailed understanding of scaffold mechanics. In this work, Cosserat rod theory is used to model individual fibers, which are connected to form a network that is treated as a representative volume element (RVE) of the material. The continuum theory is the basis for a finite element discretization. The nonlinear equations are solved using Newton's method in a parallel implementation that is capable of accurately capturing the large, three-dimensional fiber rotations and large fiber stretches that result from the large macroscopic deformations experienced by these biomaterials in their natural environment. Comparisons of simulation results with existing analytical models of soft tissues show that these models can predict the behavior of scaffold networks with reasonable accuracy, despite the significant differences between soft tissue and scaffold network microstructural geometry. The simulations also reveal how macroscale loading is related to the microscale fiber deformations and the load distribution among the fibers. The effects of different characteristics of the microstructural geometry on macroscopic behavior are explored, and the implications for the design of scaffolds that produce the desired macroscopic behavior are discussed.
Overall, the improved modeling of electrospun scaffolds presented in this work is an important step toward designing more functional engineered tissues.

Item: MobiShare : mobile computing with no strings attached (2013-12)
Castillo, Jason Moses; Julien, Christine
In today's world, technology is growing faster than ever. Smartphone sales have increased, creating new opportunities in pervasive computing. In pervasive computing, nodes enter and leave a network at any time. Within the network, nodes can transfer data to other nodes; the information is not retained in any static location such as a server. This mobile infrastructure requires a way to handle all the information dynamically, since relying on a centralized server in a mobile environment degrades the performance of obtaining information. The main goal of this paper is to provide data persistence using a "substrate" that is inherently not persistent. The data will be stored within the network for availability to all users. Saving data within the network provides a means to obtain any type of information without relying on the original source of the data being present, and users can continue downloads where they left off when they return to the network. Consider an environment where people can share music or books. For example, say that John Doe is searching for a particular song to download, and Jane, elsewhere in the network, has the requested song. John downloads the song without knowing that it comes from Jane. Then John leaves the network and the download stops. Whenever John rejoins the network, the download of his song will continue where he left off, and his ability to access the information will not depend on whether Jane is present in the network. John may retrieve the file from any other user who has the exact same file.
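The resume-from-any-peer idea just described can be sketched by naming a file with its content hash and tracking which chunks have been fetched, so a download can continue from any peer whose copy hashes to the same identifier. The chunk size, class names, and scenario below are invented for illustration; this is not MobiShare's actual protocol.

```python
import hashlib

CHUNK = 4  # bytes per chunk (tiny, for illustration only)

def file_id(data: bytes) -> str:
    # The content hash names the file, independent of which peer serves it.
    return hashlib.sha256(data).hexdigest()

class ResumableDownload:
    def __init__(self, fid: str, total_chunks: int):
        self.fid = fid
        self.total = total_chunks
        self.have = {}  # chunk index -> bytes received so far

    def fetch_from(self, peer_data: bytes, limit: int):
        """Pull up to `limit` missing chunks from any peer with a matching hash."""
        if file_id(peer_data) != self.fid:
            return  # a different file: ignore this peer
        for i in range(self.total):
            if limit == 0:
                break
            if i not in self.have:
                self.have[i] = peer_data[i * CHUNK:(i + 1) * CHUNK]
                limit -= 1

    def complete(self) -> bool:
        return len(self.have) == self.total

    def assemble(self) -> bytes:
        return b"".join(self.have[i] for i in range(self.total))

song = b"intro-chorus-out"                 # 16 bytes -> 4 chunks
dl = ResumableDownload(file_id(song), total_chunks=4)
dl.fetch_from(song, limit=2)               # Jane leaves after 2 chunks
dl.fetch_from(song, limit=2)               # resumed later from another peer
```

Because the download state is keyed by content hash rather than by peer identity, the second `fetch_from` call could just as well hit a completely different node holding the same file.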
The information that a user requests through a search query will be stored as metadata within the network, either by other nodes or by a temporary server. This allows data to be obtained without relying on the "main user," or creator of the data, being present in the network. Users would also be able to retrieve the data multiple times.

Item: Network strengthening for policy influencing : a case study of Kenya's Africa Adaptation Programme (AAP) of the United Nations Development Programme (2011-12)
Nkaw, John; Weaver, Catherine, 1971-; Busby, Joshua
As researchers provide compelling evidence of climate change, governments and civil society actors are being spurred to act to reverse the negative impacts of extreme climate change. The impact of climate change on Kenya is profound and staggering. It is estimated that Kenya's landmass is 582,350 km2, of which only 17% is arable, with 83% consisting of semi-arid and arid land. Climate change and human activities are causing desertification and increasing the total semi-arid and arid land. Researchers further estimate that 17% of Mombasa, or 4,600 hectares of the region's land area, will be submerged as a result of sea-level rise. This situation demands policy action. As developing countries wade into combating climate change, the government of Kenya is implementing far-reaching policies to fight it, including its 2006 water quality regulations and its 2009 regulations on the management of wetlands, riverbanks, lakeshores, and seashores. In addition, development partners such as the UNDP and civil society actors working on climate change have played a critical role complementing government policy actions. Working through the Africa Adaptation Programme (AAP), civil society organizations (CSOs) are participating in agenda setting and raising awareness to promote climate change adaptation through civic engagement.
Civic engagement serves as an important tool for nongovernmental organizations (NGOs) to promote a more effective response to the hazardous effects of extreme climate change. Despite this, researchers and policy analysts argue that civil society's work within the environmental sector is not based on rigorous research, that its actions are uncoordinated, and that its outcomes are poorly communicated. As a focal point, this report examined how CSOs organize around key policy issues and work through the AAP to set the agenda and influence climate change policymaking in Kenya. The study is based largely on an evaluation of secondary data sources, including websites, Programme documents, and academic articles. I also benefited from a summer internship at UNDP offices in Nairobi in 2010. The study explored how the AAP is professionalizing and how that increases its leverage and strengthens NGOs to participate actively in policy influencing. The study summarizes scattered pieces of information into one report to enhance the AAP's database-building efforts. Finally, it serves as a resource for CSOs' policy engagement in Kenya and beyond. Overall, the report reveals that the AAP is bridging ties between CSOs working within the climate change sector by bringing them under one umbrella. This social bonding behavior serves as social capital to influence policy. However, to increase leverage for effective policy engagement, the AAP needs to incrementally apply rigorous evidence-based research to generate more compelling information that transforms policies. The report further suggests commercializing clean energy technologies by charging affordable rates for deploying such infrastructure to households.
Finally, using policy entrepreneurs can dramatically improve policy advocacy in Kenya.

Item: Process Optimization and Integration Strategies for Material Reclamation and Recovery (2012-07-16)
Kheireddine, Houssein
Industrial facilities are characterized by significant usage of natural resources and massive discharge of waste materials. An effective strategy towards the sustainability of industrial processes is the conservation of natural resources through waste reclamation and recycling. Because of the large number of design alternatives, systematic procedures must be developed for the effective synthesis and screening of reclamation and recycling options. The objective of this work is to develop systematic and generally applicable procedures for the synthesis, design, and optimization of resource conservation networks. Focus is given to two important applications: material utilities (with water as an example) and spent products (with lube oil as an example). Traditionally, most previous research efforts in the area of designing direct-recycle water networks have considered chemical composition as the basis for process constraints. However, many design problems are not component-based; instead, they are property-based (e.g., pH, density, viscosity, chemical oxygen demand (COD), biochemical oxygen demand (BOD), toxicity). Additionally, thermal constraints (e.g., stream temperature) may be required to identify acceptable recycles. In this work, a novel approach is introduced to design material-utility (e.g., water) recycle networks that allows the simultaneous consideration of mass, thermal, and property constraints. Furthermore, the devised approach accounts for the heat of mixing and for the interdependence of properties. An optimization formulation is developed to embed all potential configurations of interest and to model the mass, thermal, and property characteristics of the targeted streams and units.
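A property constraint of the kind just described is commonly checked with a flow-weighted mixing rule on property operators. The sketch below is a generic illustration of that idea, not the dissertation's formulation; the flows, COD values, and sink limit are invented.

```python
def mix(flows, values):
    """Flow-weighted linear mixing rule: psi_mix = sum(F_i * psi_i) / sum(F_i).

    `values` are property operators psi(p); for some properties (e.g. pH)
    the operator is a nonlinear transform of the raw property, which is
    what makes a linear rule applicable.
    """
    total = sum(flows)
    return sum(f * v for f, v in zip(flows, values)) / total

# Illustrative blend: a recycled process stream mixed with fresh water,
# checked against a sink's property constraint (numbers are made up).
flows = [3.0, 1.0]            # t/h: recycled stream, fresh water
cod = [80.0, 0.0]             # chemical oxygen demand, mg/L
cod_mix = mix(flows, cod)     # flow-weighted average COD of the blend
sink_ok = cod_mix <= 75.0     # sink's maximum admissible COD
```

In a full recycle-network optimization, constraints of this form (one per tracked property, plus mass and thermal balances) bound the feasible stream allocations.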
Solution strategies are developed to identify stream allocations and targets for minimum fresh usage and waste discharge. A case study on water management is solved to illustrate the concept of the proposed approach and its computational aspects. Next, a systematic approach is developed for the selection of solvents, solvent blends, and system design in extraction-based reclamation processes for spent lube oil. Property-integration tools are employed for the systematic screening of solvents and solvent blends. The proposed approach identifies the main physical properties that influence solvent performance in extracting additives and contaminants from used lubricating oils (i.e., solubility parameter (δ), viscosity (ν), and vapor pressure (p)). The results of the theoretical approach are validated through comparison with experimental data for single solvents and for solvent blends. Next, an optimization formulation is developed and solved to identify the system design and extraction solvent(s) by including techno-economic criteria. Two case studies are solved for the identification of feasible blends and for the cost optimization of the system.

Item: Remote USB Ports (2013-10-01)
Roshan, Rakesh
Simplicity, easy installation, plug-and-play operation, high bandwidth, low latency, and bus power are features of USB devices. Because of these features, many sensors and actuators are manufactured with USB interfaces for use in industry. These sensors and actuators need to be installed in the field, and a computer system with USB interfaces must be present at the location of the USB device for it to work. In industry, these sensors and actuators are scattered over a large geographical area, and the computers connected to them expose a large attack surface. These computers can be consolidated using virtualization and networking to reduce the attack surface.
To consolidate computers, we need a solution that extends USB ports over networks, so that a USB sensor or actuator placed in the field can be accessed by a remote system securely. In this thesis, we propose a remote USB port, which is an abstraction of a USB port. In the USB core driver of the server machine, the status of every port, together with hub information, is stored in a port status table. On the client machine, a virtual host driver is created to manage proxy USB ports. When a device is inserted into or removed from a USB port on the server, the client is notified and the corresponding device driver is loaded or unloaded, respectively. To secure USB request blocks (URBs), URB headers are encrypted before being sent over the network. We have implemented our solution in the Linux 3.5 kernel and tested it on two machines connected over a 100 Mbps network. Various types of USB devices were connected to the server machine and tested from the client machine. We found our solution to be device-, device-driver-, and USB-protocol-independent, and transparent to network and device failures.

Item: Resource-constrained, scalable learning (2015-08)
Mitliagkas, Ioannis; Vishwanath, Sriram; Caramanis, Constantine; Dimakis, Alex; Sanghavi, Sujay; Ravikumar, Pradeep
Our unprecedented capacity for data generation and acquisition often reaches the limits of our data storage capabilities. Situations in which data are generated faster, or at a greater volume, than can be stored demand a streaming approach. Memory is an even more valuable resource. Algorithms that use more memory than necessary can pose bottlenecks when processing high-dimensional data, and the need for memory-efficient algorithms is especially acute in the streaming setting. Finally, the network, along with storage, emerges as the critical bottleneck in the context of distributed computation.
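The streaming, memory-limited setting described above can be illustrated with Oja's classic one-pass update for the top principal component: each sample is seen exactly once and only the current direction estimate is kept in memory. This is a standard textbook method used here purely as an illustration (the item's own algorithms and guarantees may differ); the step size and synthetic stream are invented.

```python
import numpy as np

def oja_top_component(samples, dim, step=0.01):
    """One-pass, memory-limited estimate of the top principal direction.

    Memory is O(dim): only the current estimate is stored -- never the
    samples themselves, and never a dim x dim covariance matrix.
    """
    rng = np.random.default_rng(0)
    w = rng.normal(size=dim)
    w /= np.linalg.norm(w)
    for x in samples:                 # each sample is seen exactly once
        w += step * x * (x @ w)      # Oja's update
        w /= np.linalg.norm(w)       # renormalize
    return w

# Synthetic stream whose variance is dominated by the first coordinate.
rng = np.random.default_rng(1)
stream = (np.array([3.0, 0.3, 0.3]) * rng.normal(size=3) for _ in range(2000))

w = oja_top_component(stream, dim=3)
alignment = abs(w[0])   # |cosine| with the true top direction e1
```

Note that `stream` is a generator: the data are consumed as they arrive and never materialized, which is exactly the constraint a single-pass streaming algorithm must live with.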
These computational constraints spell out a demand for efficient tools that guarantee a solution in the face of limited resources, even when the data are very noisy or highly incomplete. In the first part of this dissertation, we present our work on streaming, memory-limited Principal Component Analysis (PCA). Therein, we give the first convergence guarantees for an algorithm that solves PCA in the single-pass streaming setting. Then we discuss the distinct challenges that arise when the received samples are overwhelmingly incomplete, and present an algorithm and analysis that deal with this issue. Finally, we give an extensive set of experimental results that showcase the practical merits of our algorithm over the state of the art. Heavy network communication arises as the bottleneck in cluster computation. In that paradigm, a set of worker nodes is connected over the network to produce a cluster with improved computational and storage capacities, which comes with an increased need for communication across the network. In the last part of this work, we consider the problem of PageRank on graph engines. Therein, we make changes to GraphLab, a state-of-the-art platform for distributed graph computation, in a way that leads to a 7x-10x speedup for certain PageRank approximation tasks. Accompanying analysis supports the behaviour we see in our experiments.

Item: Web news in China : a new hierarchy of centrality? : an analysis of the linking pattern of China's online news network (2010-05)
Chen, Xin, 1977 Aug. 2-; Lasorsa, Dominic L.; Poindexter, Paula M.; Straubhaar, Joseph; Chyi, Hsiang I.; Alves, Rosental C.
The present study explored three questions: What is the linking pattern of China's cyber news space? What factors contribute to this pattern? And what is the distribution of links in real geographic places? The concept of the cyber news space refers to the globally connected networks of online news production.
It is a tool to understand the spatial distribution of online news production and the map of the world as presented in the media. This study is a content analysis of news webpages from China's four leading commercial portals; it sampled about 900 news webpages during the spring of 2008. China's commercial portals are news aggregators and distributors: they are the gatekeepers of China's cyber news space. On each of their news webpages there is one hyperlink that leads to the original publisher of the story, providing a clue as to how news organizations are connected online. The content analysis coded these links and other information such as media type, production sites, and locations of stories. This study found a pattern of concentration in the distribution of links among online news organizations. A multiple regression model was used to test the factors that may contribute to this pattern, and the geographic location of news organizations was found to be such a factor: the more centrally a news organization was located, the more links it attracted from the portals. In addition, this study analyzed the distribution of links among different provinces (or province-level administrations) of China. It found that Beijing, Chongqing, Guangdong, Jilin, and Shanghai are hubs, while more remote provinces, such as Xinjiang and Guizhou, were largely bypassed.
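The link-concentration pattern reported in the last study is the kind of result that falls out of a simple tally over coded link data: count how often each original publisher is linked to, then look at the share captured by the top organizations. The organizations and counts below are invented for illustration; they are not the study's data.

```python
from collections import Counter

# Illustrative link codings: each entry records the original publisher
# that a sampled portal news page linked to (names and counts invented).
links = (["XinhuaNet"] * 50 + ["PeoplesDaily"] * 30 + ["ProvincialA"] * 5
         + ["ProvincialB"] * 3 + ["ProvincialC"] * 2)

counts = Counter(links)
total = sum(counts.values())

# Concentration: share of all links captured by the top two publishers.
top2_share = sum(n for _, n in counts.most_common(2)) / total
```

A `top2_share` close to 1 over many coded pages is exactly the "hierarchy of centrality" pattern the study describes; the regression step then asks which organizational attributes (such as geographic centrality) predict a high count.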