Efficient Local Search in Imaging Optimization Problems with a Hybrid Evolutionary Algorithm
The paper focuses on the efficiency of local search in a hybrid evolutionary algorithm (HEA), with application to optimization problems frequently encountered in electronic imaging. Although an HEA can significantly improve the overall performance of evolutionary search, directly applying local optimization methods gives rise to several performance problems, including a noticeable additional cost of fitness evaluations attributed to local search; an excessive waste of computational resources on a particular chromosome that is later discarded by the global search; and a possible convergence to a sub-optimal solution when the actual distance from the global optimum is not small enough for the local search to descend successfully to the minimum point. The computational performance of local search can be improved by applying the following techniques: using direct search, which better accommodates shape irregularities of the fitness function; adding randomness and periodically re-positioning the search, thus preventing it from converging to a sub-optimal point; creating a tree-like structure for each local neighborhood that keeps track of the explored search space; using cyclic rather than complete local search, thus cutting down the excessive cost attributed to discarded chromosomes; and incorporating image response analysis, providing the algorithm with a means of deriving problem-specific knowledge that speeds up the solution. A two-phase cyclic local search incorporating these techniques is proposed. A series of computational experiments with 2-dimensional grayscale images supports the proposed approach and shows that the computational performance of local search in imaging optimization with an HEA can be significantly improved.
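The two-phase cyclic search outlined above can be sketched in a few lines; all function and parameter names here are illustrative, not the paper's implementation:

```python
import random

def cyclic_local_search(fitness, x, step=0.5, cycles=3, evals_budget=60, seed=0):
    """Two-phase cyclic direct search (illustrative sketch).

    Phase 1: probe each coordinate direction (direct search, no gradients).
    Phase 2: random re-positioning to escape sub-optimal basins.
    A fixed evaluation budget prevents a single chromosome from
    monopolizing fitness evaluations.
    """
    rng = random.Random(seed)
    best, best_f = list(x), fitness(x)
    evals = 1
    for _ in range(cycles):
        # Phase 1: one pass of coordinate-wise direct search.
        for i in range(len(best)):
            for delta in (step, -step):
                if evals >= evals_budget:
                    return best, best_f
                cand = list(best)
                cand[i] += delta
                f = fitness(cand)
                evals += 1
                if f < best_f:
                    best, best_f = cand, f
                    break
        # Phase 2: small random re-positioning, kept only if it improves.
        jitter = [b + rng.uniform(-step, step) for b in best]
        f = fitness(jitter)
        evals += 1
        if f < best_f:
            best, best_f = jitter, f
        step *= 0.5  # contract the probe step each cycle
    return best, best_f

# Usage on a simple quadratic fitness surface with minimum at (1, -2).
best, best_f = cyclic_local_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                                   [0.0, 0.0])
```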
Efficient Video Streaming Scheme for Next Generations of Mobile Networks
Majdi Ashibani, Fathi Ben Shatwan
Video streaming over next generations of mobile networks has undergone enormous development recently due to the continuing growth of wireless communication, especially since the emergence of 3G wireless networks. The new generations of wireless networks pose many challenges, including supporting quality of service over wireless communication links, due to the time-varying characteristics of the wireless channel. A more flexible and efficient bandwidth allocation scheme is therefore needed. This paper is part of ongoing work toward a more robust scheme capable of rapidly adapting to changes in network conditions. The proposed scheme focuses on the wireless part of the network, providing high-quality video service and better utilization of network resources.
Engineering Emergence for Cluster Configuration
Distributed applications are being deployed on ever-increasing scale and with ever-increasing functionality. Due to the accompanying increase in behavioural complexity, self-management abilities, such as self-healing, have become core requirements. A key challenge is the smooth embedding of such functionality into our systems.
Natural distributed systems such as ant colonies have evolved highly efficient behaviour. These emergent systems achieve high scalability through the use of low complexity communication strategies and are highly robust through large-scale replication of simple, anonymous entities. Ways to engineer this fundamentally non-deterministic behaviour for use in distributed applications are being explored.
An emergent, dynamic, cluster management scheme, which forms part of a hierarchical resource management architecture, is presented. Natural biological systems, which embed self-healing behaviour at several levels, have influenced the architecture. The resulting system is a simple, lightweight and highly robust platform on which autonomic applications can be deployed.
Exploring the Knowledge Management Index as a Performance Diagnostic Tool
Jakov Crnkovic, Salvatore Belardo, Derek A. Asoh
The knowledge management index (KMI) has been proposed as a parsimonious and useful tool to help organizations gauge their knowledge management (KM) capabilities. This may be the first step in understanding the difference between what an organization is currently doing and what it needs to do in order to maintain and improve its performance level. At the macro level, the index enables organizations to compare themselves with each other. At the micro level, it calls attention to areas needing improvement in current and future KM initiatives. In either case, the KMI provides a robust indicator and basis for business decision-making and organizational support and development.
This paper presents a holistic approach to KM that relates key knowledge management processes (KMP) and the critical success factors (CSF) needed to implement them successfully. By juxtaposing these processes and success factors, we create Belardo’s matrix, which enables us to characterize an organization and estimate its KMI.
At the macro level, we used realized KMI values and estimates of organizational performance (OP) to confirm the positive correlation between the KMI and OP. Additional findings include a comparison of the current and expected role of KM in organizations and a discussion of the marginal values of the rows (CSF) and columns (KM processes) of the proposed matrix.
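The matrix-based KMI estimate and its row/column marginals can be illustrated with a small sketch; the scoring scale and the mean-based formula below are hypothetical stand-ins for the paper's exact scheme:

```python
def knowledge_management_index(matrix):
    """Illustrative KMI estimate from a CSF-by-KMP score matrix.

    Rows are critical success factors (CSF), columns are KM processes
    (KMP); each cell holds a maturity rating on an assumed 1-5 scale.
    The index here is simply the mean cell score normalised to [0, 1];
    row and column means serve as the marginal values discussed above.
    """
    rows = len(matrix)
    cols = len(matrix[0])
    total = sum(sum(row) for row in matrix)
    kmi = total / (rows * cols * 5.0)  # normalise by the max rating, 5
    row_marginals = [sum(r) / cols for r in matrix]                      # per CSF
    col_marginals = [sum(m[j] for m in matrix) / rows for j in range(cols)]  # per KMP
    return kmi, row_marginals, col_marginals

scores = [[4, 3, 5],   # e.g. leadership scored against three KM processes
          [2, 3, 4]]   # e.g. culture scored against the same processes
kmi, by_csf, by_kmp = knowledge_management_index(scores)
```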
Fuzzy Optimization and Normal Simulation for Solving Fuzzy Web Queuing System Problems
Xidong Zheng, Kevin Reilly, James Buckley
In this paper, we use both fuzzy optimization and normal simulation methods to solve fuzzy web planning model problems, which are queuing system problems for designing web servers. We apply fuzzy probabilities to the queuing system models with customer arrival rate λ and server service rate μ, and then compute fuzzy system performance variables, including Utilization, Number (of requests) in the System, Throughput, and Response Time. For the fuzzy optimization method, we apply a two-step calculation: first we use fuzzy computation to obtain the maximum and minimum values of the fuzzy steady-state probabilities, and then we compute the fuzzy system performance variables. For the simulation method, we use one-step normal queuing theory to simulate the whole system performance and its variables. We deal with queuing systems in both the single-server and multiple-server cases, compare the results of these two cases, and give a mathematical explanation of the difference.
Keywords: Fuzzy Optimization, Normal Simulation, Queuing Theory, Web Planning Model.
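The two-step idea above can be sketched for an M/M/1 queue, with an interval standing in for one α-cut of the fuzzy rates; since the M/M/1 measures are monotone in λ and μ, evaluating the crisp formulas at the interval endpoints bounds each measure. This is a simplification of the paper's method, for illustration only:

```python
import itertools

def mm1_performance(lam, mu):
    """Crisp M/M/1 steady-state measures (requires lam < mu)."""
    rho = lam / mu                 # utilization
    L = rho / (1.0 - rho)          # mean number of requests in the system
    W = 1.0 / (mu - lam)           # mean response time
    return {"utilization": rho, "number_in_system": L, "response_time": W}

def fuzzy_mm1(lam_interval, mu_interval):
    """Interval (alpha-cut) version: evaluate the crisp formulas at all
    endpoint combinations and keep the min/max of each measure."""
    results = [mm1_performance(l, m)
               for l, m in itertools.product(lam_interval, mu_interval)
               if l < m]
    return {k: (min(r[k] for r in results), max(r[k] for r in results))
            for k in results[0]}

bounds = fuzzy_mm1((3.0, 4.0), (5.0, 6.0))   # lambda in [3,4], mu in [5,6]
```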
Harnessing the Power of Scientific Data Warehouses
A data warehousing architecture should generally protect the confidentiality of data before publication, provide sufficient granularity to enable scientists to manipulate data in various ways, support robust metadata services, and define standardized spatial components. Data can then be transformed into information that is readily available in a common format: easily accessible, fast, and bridging the islands of dispersed information. The benefits of the warehouse can be further enhanced by adding a spatial component so that the data can be brought to life, overlaying layers of information in a format that is easily grasped by management, enabling them to tease out trends in their areas of expertise.
Interoperability of Geographic Information: A Communication Process-Based Prototype
Since 1990, municipal, state/provincial, and federal governments have developed numerous geographic databases to fulfill their organizations’ specific needs. As a result, the same real-world topographic phenomena have been abstracted differently, for instance vegetation (surface), trees (surface), wooded area (line), wooded area (point and line), milieu boisé (surface), and zone boisée (unknown geometry). Today, information about these geographic phenomena is accessible on the Internet from Web infrastructures specially developed to simplify access. Early in the nineties, work on interoperability of geographic information was undertaken to resolve syntactic, structural, and semantic heterogeneities, as well as spatial and temporal heterogeneities, and thereby facilitate the sharing and integration of such data. Recently, we proposed a new conceptual framework for interoperability of geographic information based on the human communication process, cognitive science, and ontology, and introduced geosemantic proximity, a reasoning methodology that dynamically qualifies the semantic similarity between geographic abstractions. This framework could be of interest to other disciplines. This paper presents the details of our framework for interoperability of geographic information as well as a prototype.
Mathematical Physics Framework Sustaining Natural Anticipation and Selection of Attention
An ambient intelligent environment is a prerequisite for systems that anticipate needs and catch attention, but how to endow such an environment with natural anticipatory and attentive features is a question that has rarely been properly addressed. Before providing a roadmap towards such an ambient intelligent environment, we first give cognitive-ergonomic accounts of how natural anticipation and selection of attention (NASA) emerge in living organisms. In particular, we describe why, when, and how exploratory and goal-directed acts by living organisms are controlled while optimizing their changing and limited structural and functional capabilities of multimodal sensor, cognitive, and actuator systems. Next, we describe how NASA can be embedded and embodied in sustainable intelligent multimodal systems (SIMS). Such systems allow an ambient intelligent environment to (self-)interact while taking its contexts into account. In addition, collective intelligent agents (CIA) distribute, store, extend, maintain, optimize, diversify, and sustain the NASA embedded and embodied in the ambient intelligent environment. Finally, we present the basic ingredients of a mathematical-physical framework for empirically modeling and sustaining NASA within SIMS by CIA in an ambient intelligent environment. An environment modeled this way robustly and reliably aligns, over time, multi-sensor detection and fusion; multimodal fusion, dialogue planning, and fission; and multi-actuator fission, rendering, and presentation schemes. NASA residing in such an environment are then active within every phase of the perception-decision-action cycles, and are gauged and renormalized to its physics. After determining and assessing appropriate fitness and utility measures across several evolutionary dynamic scales, NASA can be realized by reinforcement learning and self-organization.
Mixture Segmentation of Multispectral MR Brain Images for Multiple Sclerosis
Lihong Li, Xinzhou Wei, Xiang Li, Syed Rizvi, Zhengrong Liang
We present a fully automatic mixture model-based tissue classification of multispectral (T1- and T2-weighted) magnetic resonance (MR) brain images. Unlike the conventional hard classification with a unique label for each voxel, our method models a mixture to estimate the partial volumes (PV) of multiple tissue types within a voxel. A new Markov random field (MRF) model is proposed to reflect the spatial information of tissue mixtures. Mixture classification is performed under the maximum a posteriori (MAP) criterion, where the expectation-maximization (EM) algorithm is utilized to estimate model parameters. The algorithm interleaves segmentation with parameter estimation and improves classification in an iterative manner. The presented method is evaluated on clinical MR image datasets for quantification of brain volumes and multiple sclerosis (MS).
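The core EM iteration behind such soft classification can be sketched in one dimension; the E-step responsibilities play the role of the partial-volume estimates above. This is illustrative only: the paper couples EM with an MRF spatial prior and multispectral data, neither of which is modeled here.

```python
import math

def em_two_gaussians(data, iters=50):
    """Minimal 1-D EM for a two-component Gaussian mixture."""
    mu = [min(data), max(data)]   # crude initialisation
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: per-sample membership probabilities (soft labels).
        gamma = []
        for x in data:
            dens = [pi[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                    for k in range(2)]
            s = sum(dens)
            gamma.append([d / s for d in dens])
        # M-step: re-estimate means, variances, and mixing weights.
        for k in range(2):
            nk = sum(g[k] for g in gamma)
            mu[k] = sum(g[k] * x for g, x in zip(gamma, data)) / nk
            var[k] = max(1e-6, sum(g[k] * (x - mu[k]) ** 2
                                   for g, x in zip(gamma, data)) / nk)
            pi[k] = nk / len(data)
    return mu, var, pi

# Two well-separated clusters of synthetic "intensities".
samples = [0.1, 0.0, -0.2, 0.15, 5.0, 5.2, 4.9, 5.1]
mu, var, pi = em_two_gaussians(samples)
```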
Multicriteria Optimization of Gasification Operational Parameters Using a Pareto Genetic Algorithm
Miguel Caldas, Luisa Caldas, Viriato Semiao
Gasification is a well-known technology that allows a combustible gas to be obtained from a carbonaceous fuel by a partial oxidation (POX) process. The resulting gas (synthesis gas or syngas) can be used either as a fuel or as a feedstock for chemical production. Recently, gasification has also received a great deal of attention concerning power production through the IGCC (Integrated Gasification Combined Cycle) process, which is currently the most environmentally friendly and efficient method for the production of electricity. Gasification allows low-grade or dirty fuels to be used in an environmentally acceptable way. Among these fuels are wastes from the petrochemical and other industries, which vary in composition from shipment to shipment and from lot to lot. If operating conditions are kept constant, this variability can result in a loss of efficiency. This paper presents an application of Genetic Algorithms to optimize the operating parameters of a gasifier processing a given fuel, so that the system achieves maximum efficiency for each particular fuel composition. A Pareto multiobjective optimization method, combined with a Genetic Algorithm, is applied to the simultaneous maximization of two objective functions: Cold Gas Efficiency and Hydrogen Content of the syngas. Results show that the optimization method developed is fast and simple enough to be used for on-line adjustment of the gasification operating parameters for each fuel composition and aim of gasification, thus improving the overall performance of the industrial process.
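The selection rule a Pareto GA applies each generation — keeping only non-dominated candidates — can be shown compactly. The two objectives stand in for (cold gas efficiency, hydrogen content); the values are made up for illustration:

```python
def pareto_front(points):
    """Return the non-dominated points for simultaneous maximization.

    A point dominates another when it is at least as good in every
    objective and strictly better in at least one.
    """
    def dominates(a, b):
        return (all(x >= y for x, y in zip(a, b))
                and any(x > y for x, y in zip(a, b)))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (cold gas efficiency, H2 fraction) pairs for four settings.
candidates = [(0.70, 0.30), (0.65, 0.40), (0.60, 0.35), (0.72, 0.25)]
front = pareto_front(candidates)
```

Here (0.60, 0.35) is dominated by (0.65, 0.40), which is better in both objectives, so it drops out; the remaining three trade one objective against the other.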
On Eigenfunction Based Spatial Analysis for Outlier Detection in High-Dimensional Datasets
This paper is concerned with two methods for detecting outliers in high-dimensional data sets: one based on eigenvalue analysis, and the other a modified version of singular value decomposition (SVD) called pseudo-SVD. The eigenvalue analysis approach examines the spatial relationship among the column vectors of the object-attribute matrix to gain insight into the degree of inconsistency in a cluster of data. The pseudo-SVD method, in which the singular values are allowed to carry a sign, looks at the direction of the vectors in the object-attribute matrix and detects outliers based on the degree of their orthogonality. The pseudo-SVD algorithm is formulated as an optimisation problem for clustering the data on the basis of their angular inclination. The methods have been applied to two case studies: one pertaining to a dermatological dataset and the other to an engineering problem of state estimation. Further research directions are also discussed.
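The angular-inclination idea can be illustrated with a plain cosine-similarity check: a vector nearly orthogonal to the rest of the cluster is flagged. This captures only the orthogonality intuition above, not the paper's full pseudo-SVD optimisation; the threshold is an assumption.

```python
import math

def angular_outliers(vectors, threshold=0.5):
    """Flag vectors whose direction deviates from the rest.

    Each vector's mean absolute cosine similarity to the others measures
    its angular alignment; a score below `threshold` marks an outlier.
    """
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)
    outliers = []
    for i, v in enumerate(vectors):
        score = sum(abs(cos(v, w)) for j, w in enumerate(vectors) if j != i)
        score /= len(vectors) - 1
        if score < threshold:
            outliers.append(i)
    return outliers

# Three vectors roughly along the x-axis and one orthogonal to them.
data = [(1.0, 0.1), (0.9, 0.2), (1.1, 0.0), (0.0, 1.0)]
flagged = angular_outliers(data)
```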
Patient Safety Technology Gap: Minimizing Errors in Healthcare through Technology Innovation
In a world of ever-increasing technological advances, users of technology are at risk of exceeding human memory limitations. A gap analysis was conducted by reviewing the literature on human error, specifically transition errors in emergency room (ER) operations, to identify the current state of available technology. The gap analysis revealed the technological needs of ER healthcare workers. The findings indicate the need for technologies such as knowledge management and decision support systems in ERs to reduce the potential for error, enhance patient safety, and improve the overall quality of care for the patient.
Personalization by Relevance Ranking Feedback in Impression-based Retrieval for Multimedia Database
Tsuyoshi TAKAYAMA, Hirotaka SASAKI, Shigeyuki KURODA
This paper proposes an approach to personalization by relevance ‘ranking’ feedback in impression-based retrieval for a multimedia database. Impression-based retrieval is a kind of ambiguous retrieval that enables a database user to find not only data already known to him/her but also data previously unknown. Conventional approaches using the relevance feedback technique return only binary information, ‘relevant’ or ‘not relevant’, about the user’s retrieval intention. In our approach, the user returns a relevance ranking, with respect to his/her retrieval intention, for the top n items of a retrieval result. From this feedback information, user-specific adjustment data are produced and utilized for personalization. We show the effectiveness of the approach by an evaluation using our pilot system.
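A minimal sketch of the ranking-feedback idea: higher-ranked items pull a per-user adjustment vector more strongly than lower-ranked ones. The 1/rank weighting, learning rate, and names are assumptions for illustration, not the paper's exact scheme:

```python
def ranking_feedback_adjustment(profile, ranked_items, rate=0.1):
    """Update a per-user adjustment vector from relevance rankings.

    `ranked_items` is a list of (rank, feature_vector) pairs, rank 1
    being the most relevant; each item nudges the profile toward its
    features with weight proportional to 1/rank.
    """
    for rank, features in ranked_items:
        weight = rate * (1.0 / rank)       # rank 1 contributes the most
        for i, f in enumerate(features):
            profile[i] += weight * (f - profile[i])
    return profile

# Feedback on two items: the rank-1 item moves the profile more.
user_profile = [0.0, 0.0]
user_profile = ranking_feedback_adjustment(
    user_profile, [(1, [1.0, 0.0]), (2, [0.0, 1.0])])
```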
Realization of Personalized Services for Intelligent Residential Space based on User Identification Method using Sequential Walking Footprints
Jin-Woo Jung, Dae-Jin Kim, Z. Zenn Bien
A new human-friendly assistive home environment, Intelligent Sweet Home (ISH), has been developed at KAIST, Korea, for testing advanced concepts for independent living of the elderly and the physically handicapped. The concept of ISH is to consider the home itself an intelligent robot. ISH continually checks the intention or health status of the resident, and can thus proactively provide the most appropriate services, tailored to the resident’s lifestyle, based on the detected intention or emergency information. However, when there are two or more residents, ISH cannot take the residents’ individual characteristics or tastes into account unless it can first identify who each resident is.
To realize a personalized service system in an intelligent residential space such as ISH, we address a human-friendly user identification method for ubiquitous computing environments, focused in particular on dynamic human footprint recognition. We then present some case studies of personalized services carried out at the Human-friendly Welfare Robot System research center, KAIST.
Sensors in Distributed Mixed Reality Environments
Felix Hamza-Lup, Charles Hughes, Jannick Rolland
A distributed mixed-reality (MR) or virtual reality (VR) environment implies the cooperative engagement of a set of software and hardware resources. With the advances in sensors and computer networks, we have seen an increase in the number of potential MR/VR applications that require large amounts of information collected from the real world through sensors (e.g. position and orientation tracking sensors). These sensors collect data from the real environment in real time at different locations, and a distributed environment connecting them must ensure data distribution among collaborative sites at interactive speeds. With the advances in sensor technology, we envision that in future systems a significant amount of data will be collected from sensors and devices attached to the participating nodes. This paper proposes a new architecture for sensor-based interactive distributed MR/VR environments that falls between the atomistic peer-to-peer model and the traditional client-server model. Each node is autonomous and fully manages its resources and connectivity. The dynamic behavior of the nodes is dictated by the human participants who manipulate the sensors attached to these nodes.
Introducing Handheld Computing for Interactive Medical Education
Joseph Finkelstein, Ashish Joshi, Mohit Arora
The goals of this project were: (1) development of an interactive multimedia medical education tool (CO-ED) utilizing modern features of handheld computing (PDAs) and major constructs of adult learning theories, and (2) pilot testing of the computer-assisted education in residents and clinicians. Comparison of the knowledge scores using a paired t-test demonstrated a statistically significant increase in subject knowledge (p<0.01) after using CO-ED. Attitudinal surveys were analyzed by calculating a total score (TS), represented as a percentage of the maximal possible score. The mean TS was 74.5±7.1%. None of the subjects (N=10) had a TS below 65%, and in half of the subjects (N=5) the TS was higher than 75%. Analysis of the semi-structured in-depth interviews showed strong support among the study subjects for using PDAs as an educational tool, and high acceptance of the CO-ED user interface. We conclude that PDAs have significant potential as a tool for clinician education.
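The paired t-test used above compares each subject's pre- and post-training scores. A minimal sketch of the statistic, with made-up scores (the study's own data are not reproduced here):

```python
import math

def paired_t_statistic(before, after):
    """Paired t-test statistic: t = mean(d) / (sd(d) / sqrt(n)),
    where d holds the per-subject score differences (after - before)."""
    d = [a - b for b, a in zip(before, after)]
    n = len(d)
    mean = sum(d) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in d) / (n - 1))  # sample sd
    return mean / (sd / math.sqrt(n))

# Hypothetical knowledge scores for four subjects, pre and post.
t = paired_t_statistic([60, 55, 70, 65], [75, 70, 80, 78])
```

The statistic is then compared against the t distribution with n-1 degrees of freedom to obtain the p-value.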