Journal of
Systemics, Cybernetics and Informatics

ISSN: 1690-4524 (Online)    DOI: 10.54808/JSCI



A Systemic/Cybernetic Notion of Design
Nagib Callaos
Pages: 1-29
The purpose of this article is 1) to describe the notion of “design” in order to show that there is almost no thought-based activity that does not, explicitly or implicitly, contain designing activities, including academic and scientific ones, professional practice, managerial action, and even everyday occupations; 2) to briefly describe the cybernetic relationships between research and design; and 3) to identify the relationships of design with intention and action. Since almost everything we have done, and do, in this life is caused by intentions that are usually followed by actions, designing processes implicitly or explicitly support thinking and doing, especially in academic, scientific, and professional activities. Since 1) “design” is usually related to engineering and professional activities and 2) this article is written for a special issue on “Research and Design”, we will more frequently be explicit with regard to scientific/research activities and to the notion of Science, whether in Traditional Science or in new approaches to Science based on the Copenhagen Interpretation of Quantum Theory, Second-Order Cybernetics, and Second-Order Systems Theory.

The Philosophy of Research
Jeremy Horne
Pages: 30-56
At either extreme, “research” ostensibly is finding new information just to know it or for some purpose. Indeed, schools are established for people to learn what already has been discovered or how to discover. Usually, matters end here. Left out of conversations about research are deeper meanings of words like “knowledge”, “education”, “bias”, and “objective”. Students rarely encounter the more sophisticated “epistemology”, “second-order critique”, and “ethos”. Above all, the foundation of learning, the love of truth, is rarely touched, “educators” floating freely in the air, just as confused as their students. This essay sets forth orderly thinking about the development of research, starting with definitions, continuing with knowledge acquisition (context and problems), and ending with applying the lessons learned. Phenomena as data are transformed into information; information through epistemology (justified belief) becomes knowledge; and knowledge through ethos yields wisdom. The bias problem is overcome through bootstrapping, identifying a reference frame. Against the background of knowledge types, epistemic (theory) and technic (praxis), emerge inductive (synthetic) and deductive (analytic) methods of establishing quality reference frames. Bringing it all together, we have a philosophy of research. We all, as students with the core ethos of loving truth, are ourselves processes embedded in dialectics, the unity of opposites that describes knowledge space, from the infinitesimal to infinity. As the etymology of “research” says, humanity is wandering in search of itself.

On Architecture: Complexity and Decline
Taha A. Al-Douri
Pages: 57-61
Architecture is a collaborative, trans-disciplinary undertaking, not unlike political practice. Architects, following armies into conquered territories, turned visions of political order into architecture, the embodiment of order, as varied as the role of the architect has been throughout time. When Germany pondered her image as an industrialized nation with philosophical gravitas, she turned to design, recognizing the new means of mass production, mechanized yet not losing sight of the makings of good taste: craftsmanship, proportion, and the fitness of form to purpose, both emotional and utilitarian. The architect was once again a leader, even at times when building was at a lull, as Bruno Taut wrote in a letter dated November 24, 1919: “Today there is almost nothing to build … it is a good thing that nothing is being built today. Things will have time to ripen, we shall gather our strength, and when building begins again we shall know our objectives and be strong enough to protect our movement against botching and degeneration.” These views were in their natural context of The Crystal Chain Letters, the correspondence of “Architectural Fantasies” by Bruno Taut and his circle, which included Hermann Finsterlin, Max Taut, Walter Gropius, Hans and Wassili Luckhardt, and Hans Scharoun (Taut, 1985). Taut maintained the complexity of an organic union between building and architecture, as essential and mystical as that between body and spirit.

The Humboldt Portal: Complexity and Interconnectedness
Detlev Doherr
Pages: 62-71
The Humboldt Portal has been designed and implemented as part of an ongoing research project to develop an information system on the Internet to share the documents and rare books of Alexander von Humboldt, a 19th century German scientist and explorer, who viewed the natural world holistically and described the harmony of nature among the diversity of the physical world. Even after more than two centuries he is admired for his ability to see the natural world and human nature in the context of a complex network of relationships. The design and implementation of the Humboldt Portal are also oriented to support further research on Humboldt’s intellectual perspective.

Although all of Humboldt's works can be found on the internet as digitized documents, the complexity and internal inter-connectivity of his vision of nature cannot be adequately represented only by digitized papers or scanned documents in digital libraries.

As a consequence, a specific portal for Humboldt's documents was developed, which extends the standards of digital libraries and offers a technical approach for the adequate presentation of highly interconnected data.

Due to continuous scientific and literary research, new insights and requirements for the digital presentation of Humboldt documents are constantly emerging, so this article provides only a summary of the concepts realized so far. Consequently, the design and implementation of the Humboldt Portal are both a consequence of a continuing research project and a means to support further research on Humboldt's holistic intellectual perspective, which anticipated the systems approach of the last century.

The Influence of Tradition, Context, and Research in Doctoral Degree Design
Lorayne Robertson, Bill Muirhead
Pages: 72-82
This paper examines the design of a doctoral degree in a technology-enhanced learning environment (TELE) as a form of design research. Research, policy, practice and design are key elements that can be mutually supportive in degree development. They are also elements that can evolve during doctoral program development if there is an open stance toward innovation. Designing and developing a degree involves significant research into the types of teaching, learning and assessment that have been shown to benefit students in TELE practice. Designers of education programs draw on methodologies from research, design and practice, employing common descriptors that are meaningful to informed audiences across disciplines. While the stages of the doctoral journey are easily recognizable, a design-based research approach can be employed to include innovation and reflection within degree elements and during stages of decision-making. In many ways, design-based research and doctoral program development mirror qualitative research, which has broader, more exploratory approaches embedded in its design. A qualitative research question often involves investigation or exploration (as opposed to a hypothesis). By nature, qualitative research design is formative, iterative, reflective and responsive, as different elements of the research design impact other elements while a phenomenon is explored. Researchers immersed in a qualitative study, or developers immersed in the design of a doctoral degree, both participate in design through reflective processes. All of these factors contribute toward the synergies between design-based research and the development of a TELE doctoral education degree.

Viability, Sustainability and Non-Requisite Variety
Leonardo Lavanderos, Abelardo Araya, Alejandro Malpartida
Pages: 83-96
In his work, Ashby demonstrated the importance of a certain quantitative relationship called the law of requisite variety. After finding this relationship, Ashby related it to Shannon's theorem on the amount of noise or error that could be eliminated through a correction channel. The native limitation of the law of requisite variety, as of the Shannon equation applied in thermodynamics, is that the relational nature of the organization is not considered. Thus, he formulated that only variety can absorb variety. However, that statement is only valid when it is formulated in the domain of interactions; it cannot be sustained when dealing with relationships, as is the case in human organizations. This work introduces the concept of non-requisite variety (NRV) of viability, defined within the Relational model. A methodology is therefore introduced to "measure" the degree of waste that occurs within a network which deviates from its organizational identity due to poor communication. The history of science allows us to see that we are talking about the "history" of the reduction of non-requisite variety because, in the process of generating value, this phenomenon not only produces value equal to 0 but also generates values less than 0. Consequently, non-requisite variety destroys variety.
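The quantitative core of Ashby's law can be illustrated with a toy calculation (our own sketch, not the authors' NRV methodology): measuring variety as the log-count of distinct states, the variety a regulator can absorb bounds the variety left in the outcomes.

```python
from math import log2

def variety(states):
    """Variety as log2 of the number of distinct states (in bits)."""
    return log2(len(set(states)))

# Hypothetical state sets: 8 possible disturbances, 4 regulator responses.
disturbances = ["d1", "d2", "d3", "d4", "d5", "d6", "d7", "d8"]  # 3 bits
regulator = ["r1", "r2", "r3", "r4"]                             # 2 bits

# Ashby's bound: outcome variety can be reduced to at best
# V(disturbances) - V(regulator); variety absorbs variety.
residual = variety(disturbances) - variety(regulator)
```

Under these placeholder numbers, one bit of variety remains unabsorbed, which is the kind of shortfall the paper's notion of non-requisite variety reframes relationally.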

Increase the Success of Governmental IT-Projects
Maurice Gaikema, Mark Donkersloot, Jim Johnson, Hans Mulder
Pages: 97-105
Large and grand IT projects seem to fail worldwide. Several studies have researched the indicators of success and failure at a national level, but there is little international comparative research. What if governments could apply international lessons learned when starting their IT projects? We assessed the success factors of 110 projects in the Netherlands using the project resolution benchmark method of the Standish Group and compared large governmental and small projects to an international sample from the Standish database.

This study supports the relation between the Dutch IT projects and the international sample of the Standish Group, thus offering an opportunity to learn from international projects when applied to large and grand governmental IT-projects and small IT-projects in the Netherlands. Acknowledging international post-mortem lessons learned could raise awareness of applying an international project resolution benchmark in the prenatal phase of IT-projects.

The Interconnections of Research and Design in Context of Social Trust and the Triple Helix Concept
Annamaria Csiszer
Pages: 106-116
Both explicitly and implicitly, the central role of my paper is given to the description of the methodology that I used while designing my research and to the discussion of the problems that I faced while putting my research results onto paper. What I am really interested in is describing the mental design process through real-life research examples of social trust that show the importance of this phenomenon, which, with its own dynamics, structures, systems and subsystems, constructs and maintains the functioning of the science-economics-governance triangle. To what extent can trust in neighbours, strangers or social institutions affect our social well-being? Is the digital communication of social trust capable of solving social and economic problems? Can trust function as a social connective tissue? How can we reveal these problems with the help of the social sciences, and how can we facilitate social trust with the help of social communication research focusing especially on the interconnections between science, economics and governance? While trying to answer these questions, I attempted to function as a problem-solving agent. I designed abstract and concrete research categories in the framework of discourse analysis that functioned as tools for my research. Defining trust as social capital provides an opportunity to review national and international research, which makes it possible to survey the effects of social trust in a computer-mediated context.

Problems During Scientific Research and Designing Integration
Ingus Mitrofanovs, Marita Cekule, Kaspars Cabs
Pages: 117-128
This paper deals with experience and problems in research and project development with industry for IT solution design. It is well known that design plays an increasingly important role in our daily lives. The University of Latvia and the Joint Stock Company “Latvia’s State Forests” are developing a joint research project within the framework of an effective collaboration projects program. The partner was interested in the development of automatic volume measurement of logs and wood chip loads on trucks. To solve this problem, a methodology and a technological solution are needed that allow volumetric surveys and monitoring to be performed remotely. This is significant given the intense development of the forest and logging industry. The system consists of a measurement arch with video cameras and an IT solution: video processing and analysis software, a graphical user interface, data communication channels, and storage systems. When designing the timber assortment measurement line for field conditions, the specifics of its operation and high performance requirements should be considered. Therefore, a laboratory prototype was made to evaluate the arch size, the size of its field of operation, and the best camera and light layouts. Human experience in manual timber assortment measurement was studied for a better understanding of IT solution design. The graphical user interface design was developed taking into account the specific requirements of the partner: the integration of indicators for the determination of quality, as well as easy-to-operate functionality in the geometric measurement process.

Current State and Modeling of Research Topics in Cybersecurity and Data Science
Tamir Bechor, Bill Jung
Pages: 129-156
Arguably, the two domains closely related to information technology that have recently gained the most attention are ‘cybersecurity’ and ‘data science’. Yet the intersection of the two domains often faces the conundrum of discussions intermingled with ill-understood concepts and terminologies. A topic model is desired to illuminate significant concepts and terminologies straddling cybersecurity and data science. There is also hope of discovering under-researched topics and concepts that deserve more attention at the intersection of both domains. Motivated by these aims, and retaining most of the content of the already accepted IMCIC (International Multi-Conference on Complexity, Informatics, and Cybernetics) 2019 conference paper while supplementing it with the implicit design activities carried out during the research, this study attempts to model cybersecurity and data science topics clustered with significant concepts and terminologies, grounded on a text-mining approach applied to recent scholarly articles published between 2012 and 2018. As the means to the end of modeling topic clusters, the research is approached with a text-mining technique comprising key-phrase extraction, topic modeling, and visualization. The trained LDA model analyzed and generated significant terms from the text corpus of 48 articles and found that six latent topic clusters comprised the key terms. Afterwards, the researchers labeled the six topic clusters for future cybersecurity and data science researchers as follows: Advanced/Unseen Attack Detection, Contextual Cybersecurity, Cybersecurity Applied Domain, Data-Driven Adversary, Power System in Cybersecurity, and Vulnerability Management. The subsequent qualitative evaluation of the articles found that the LDA model's six topic clusters unveiled latent concepts and terminologies in cybersecurity and data science that enlighten both domains.
The main contribution of this research is the identification of key concepts in the topic clusters and of text-mining key-phrases from recent scholarly articles focusing on cybersecurity and data science. By undertaking this research, this study aims to advance the fields of cybersecurity and data science. Beyond the main contribution, the additional research contributions are as follows: first, the text-mining-based topic modeling unearths terminologies in the cybersecurity domain that IST (Information Systems and Technology) researchers can investigate further; second, using the results of the study's analysis, IST researchers can select terms of interest and further investigate the articles that supplied those terms.
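As a rough illustration of the key-phrase extraction stage of such a text-mining pipeline, the stdlib-only TF-IDF sketch below scores terms against a toy corpus. The documents, like the scoring itself, are our placeholders, not the authors' 48-article corpus or their LDA model.

```python
import math
from collections import Counter

def tfidf(docs):
    """Return one {term: tf-idf score} map per document."""
    tokenized = [d.lower().split() for d in docs]
    n = len(tokenized)
    # Document frequency: in how many documents each term appears.
    df = Counter(t for doc in tokenized for t in set(doc))
    scores = []
    for doc in tokenized:
        tf = Counter(doc)
        scores.append({t: (tf[t] / len(doc)) * math.log(n / df[t])
                       for t in tf})
    return scores

# Toy stand-ins for scholarly article texts.
docs = [
    "intrusion detection with machine learning",
    "machine learning for malware detection",
    "power system vulnerability management",
]
scores = tfidf(docs)
```

Terms specific to one document (e.g. "intrusion") outrank corpus-wide terms (e.g. "machine"), which is the property key-phrase extraction relies on before topics are clustered.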

Transforming Cybersecurity Education Through Consulting
Giti Javidi, Ehsan Sheybani
Pages: 157-168
Cybersecurity is a total business problem. Creating meaningful learning experiences to prepare students for complex cyber situations in an organization is no small undertaking. Higher education is ill-prepared to meet the challenge of providing learning experiences that offer both breadth and depth. To fill the void and meet the fast-growing demands of the cybersecurity workforce, the authors of this paper propose consulting experience as a strategy for creating student-driven learning. Student-led consulting will assist students in systematically assessing organizational security problems and in designing and implementing organizational interventions. We describe why college students should be equipped to conduct consulting projects under faculty supervision and how this can be done.

Designing for Learning in an Interdisciplinary Education Context
Lillian Buus, Jette A. Frydendahl, Thomas W. Jensen, Thue F. Jensen, Kirstine B. Lillelund, Mette Falbe-Hansen
Pages: 169-185
This article presents research done in an interdisciplinary educational context at VIA University College in Denmark. The research is based on a learning design approach for the design of digitally mediated learning to be integrated in educators’ teaching practice. Action learning and action research sustain the learning design methodological approach. The authors have focused on the factors that influence the educator’s and the developer’s engagement and creativity in participating in a learning design process and development of digitally mediated learning materials.

The authors reflect upon the interdisciplinary interactions that materialize in a joint research agenda, where researchers from different domains collaborate with educators and developers across VIA. Furthermore, reflections are offered on what has been learnt during the research process in relation to investigating the learning design process in an interdisciplinary educational context. Insights into which factors need attention, in general and, more importantly, in an interdisciplinary context, have been drawn from the research, and reflections on them are offered.

Analysis of Information in the Academic Management of the UNED, Required in the Self-Assessment Processes and the Relation Between Research and Design of the Investigation
Ariana Acón-Matamoros, Aurora Trujillo-Cotera
Pages: 186-196
One of the institutional concerns of the Distance State University (UNED) is to obtain appropriate information to sustain continuous improvement in the diploma, bachelor's degree, and postgraduate programs in order to reach academic excellence. Consequently, one of the challenges faced by higher education, institutional policies, and regulations is the consolidation and strengthening of the management of academic and other dependencies, because of the culture of quality and excellence and the ever-increasing demands of society. The Self-Evaluation Department of the Academic Quality Management Institute is responsible for contributing to compliance with these policies. Therefore, it proposes an information system that should address the needs not yet covered. Moreover, the information system should be accessible to academic dependencies, administrative clerks, university executives, and IGESCA coworkers, in order to enhance the evaluation, self-evaluation, certification, accreditation, and re-accreditation processes of the University. The results of this investigation were obtained from the surveys applied and the interviews with users, consistent with the information analysis. This paper specifies the importance of and necessity for the information system, which will support the self-evaluation processes of the University. It also describes the research design of this paper, in order to support academic management and decision making.

Designing a Supply System for a Productive Company
Javier Chávez González, Graciela Vázquez Álvarez, Efraín J. Martínez Ortíz, Sandra D. Orantes Jiménez
Pages: 197-212
This paper presents the design of a Supply System for a productive company, with the purpose of updating the term Supply Chain (SC), which has evolved over time to mix with other elements that make up production and the warehousing or distribution of finished products; this evolution has led to the concept named Supply Chain Management (SCM), mixed in such a way that it is confused with a management system. The proposal designs a specific system to manage only supplies, not finished products. It is visualized according to the image of the imaginary lines that seem to be drawn by the incoming supplies, which differs from the well-known chain, a form that is not applicable to all types of productive companies, and it is closer in scope to a subsystem of a management system than to the whole. As we considered this a soft system, we worked with the Jenkins methodology. By breaking up the known SCM system to identify the characteristics of its different subsystems, and by analyzing the main aspects in order to determine the parameters of an updated and appropriate system for a productive company, we determined that the Supply Pony Tail (SPT) system is the more appropriate supply system.

A Novel Interactive Network Fuzzer for System Security Assessment
Jaime C. Acosta, Christian Murga, Alberto Morales, Caesar Zapata
Pages: 213-220
Network security testing can be done at different levels of fidelity. This can range from simply scanning a network to identify open ports, services, and service versions, to uncovering novel vulnerabilities in proprietary or undocumented services. The granularity of such an analysis depends not only on time and cost, but also on the availability of client software that can be used to interact with the different services. Complexity increases when the underlying protocol is undocumented or nontrivial. In this case, testers must first understand the protocols and then develop software that can be used to interact past the common handshake or initial connection behavior to uncover vulnerabilities. In this paper, we present an architecture that marries protocol reverse engineering and network fuzzing through a graphical interface. We have developed a proof of concept (PoC) that is capable of intercepting packets between source and destination nodes, allowing analysts to use the interface to interactively or pseudo-interactively (using hooks) observe, modify, drop, and/or forward the traffic during security tests. We designed our experimentation methodology with two perspectives in mind: blue-teaming (cooperative grey/white box) and red-teaming (non-cooperative black box). We report the performance of our PoC with the Transmission Control Protocol.
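The observe/modify/drop/forward step that such hooks enable can be sketched as follows. The hook signature and the byte-flip mutation below are our own illustration of the general technique, not the paper's PoC interface.

```python
import random

def flip_bytes(payload: bytes, n: int, rng: random.Random) -> bytes:
    """Flip n random bytes of the payload (a classic dumb-fuzz mutation)."""
    buf = bytearray(payload)
    for _ in range(n):
        i = rng.randrange(len(buf))
        buf[i] ^= rng.randrange(1, 256)  # xor with a nonzero value
    return bytes(buf)

def process(payload, hook):
    """Apply a user hook to an intercepted payload.

    The hook may return the payload unchanged (forward), return a
    mutated payload (modify), or return None (drop the packet).
    """
    return hook(payload)

rng = random.Random(0)  # seeded for reproducible fuzzing runs
original = b"LOGIN user=alice"
mutated = process(original, lambda p: flip_bytes(p, 2, rng))
```

In an interception setting the `process` call would sit between capture and re-injection; the pseudo-interactive mode the paper describes corresponds to registering such a hook rather than editing each packet by hand.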

Designing for Proactive Network Configuration Analysis
Magreth Mushi, Rudra Dutta
Pages: 221-239
Human operators are an important aspect of any computing infrastructure; however, human errors in configuring systems pose reliability and security risks, which grow increasingly serious as such systems become more complex. Numerous studies have shown that errors by human administrators have contributed significantly to network misconfigurations. The research community has reacted by developing solutions largely directed at detecting and correcting misconfigurations statically, after they have been introduced into the configuration files, either by checking against known good configuration practices or by data-mining configuration files. Though such approaches are useful to some extent, they are in fact “treatments” rather than “preventions”. Automated tools that abstract complex sets of network administration tasks have also been seen as a potential solution. However, such tools simply move the possibility of human error one step back, to the development of the workflow, and can magnify the risk of such mistakes because of their speed of operation. There is a need for a proactive solution that examines the consequences of a proposed configuration before it is implemented.

In this paper, we describe the research design towards developing a proactive solution to the misconfiguration problem. We then present the design and implementation of SanityChecker, an SDN-based solution for intercepting incoming configurations and inspecting them for human errors before committing them to the devices. SanityChecker was tested by real-world network administrators, and the results show that it can successfully improve network operations by screening incoming configurations for human errors.
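A pre-commit check in this spirit might look like the following sketch: a proposed configuration is inspected against plausibility rules before it reaches a device. The rules and the configuration format here are hypothetical placeholders, not SanityChecker's actual implementation.

```python
def sanity_check(config):
    """Return a list of human-error warnings for a proposed interface config."""
    errors = []
    mtu = config.get("mtu")
    if mtu is not None and not 576 <= mtu <= 9216:
        errors.append(f"mtu {mtu} outside plausible range")
    vlan = config.get("vlan")
    if vlan is not None and not 1 <= vlan <= 4094:
        errors.append(f"vlan {vlan} invalid")
    if config.get("shutdown") and config.get("role") == "uplink":
        errors.append("shutting down an uplink port")
    return errors

# A proposed change with three likely typos/mistakes; an interceptor
# would refuse to commit it and report the warnings to the operator.
proposed = {"mtu": 15000, "vlan": 4095, "role": "uplink", "shutdown": True}
problems = sanity_check(proposed)
```

The point of the SDN placement is that this check runs in the controller path, so a bad configuration is caught before any device state changes, rather than being detected afterwards by static analysis.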

Research Design for Evaluating the Impact in SMES Related to the Technological Means Imposed by the Mexican Tax Authorities
Jesús Vivanco, Ma. del Carmen Martinez
Pages: 240-248
The purpose of this paper is to DESIGN a research study (WHAT) for identifying the impact (WHY) on SMEs of the tax provisions established by the Mexican tax authorities related to electronic invoicing and electronic accounting, which could generate a change in SME administration culture, since SMEs had to learn how to handle the official programs established by the Mexican authorities. Through research based on a documental analysis of official data (HOW), we can see that the level of electronic invoicing increased by 799% from 2011 to 2014, which means that SME owners have been improving their knowledge of ICT handling, complying with the authorities' requirements, and enhancing SME performance.

Evaluation by Competences in a Clinical Environment of a Public University in Peru (Invited Paper)
Maritza Placencia Medina, Javier Silva Valencia, Elías J. Carrasco Escobedo, Marissa Muñoz-Ayala, Jorge R. Carreño Escobedo, Carlos Saavedra Castillo, Yanelli K. Ascacivar Placencia
Pages: 249-259
The evaluation of medical students in the clinical setting (outside the classroom) is a great challenge because the learning process becomes more complex. There is little scientific literature in which the research ends in an action to design efficient forms of evaluation. We aimed to design and validate an instrument for competency-based evaluation in the course Introduction to Clinical Medicine at the National University of San Marcos (UNMSM) in Lima, Peru. This publication follows an action-research methodology, in which the initial results led to the design of an evaluation methodology for clinical environments that was then re-evaluated to determine whether it really manages to evaluate the comprehensive skills required of a medical student. Results: 14 professors were interviewed. In the clinical environment, theoretical lectures and planned didactic sessions based on clinical cases are used. In clinical practice, priority is given to clinical thinking and problem-based learning (PBL). The research team, in conjunction with the professors, started the competency-based evaluation process by developing an evaluation instrument for the specific clinical practice. The participants observed resistance to the change because of certain administrative barriers and poor institutional support. The critical point of this investigation was the training in evaluation and learning methodologies. A training plan was required before professors started their teaching activities. The professors agreed with the new form of evaluation and recognized the value of the teaching service with responsible and ethical dedication.

Development of the Software Cryptographic Service Provider on the Basis of National Standards
Rakhmatillo Djuraevich Aloev, Mirkhon Mukhammadovich Nurullaev
Pages: 260-272
The article provides a brief description of the cryptography service provider software developed by the authors, which is designed to create encryption keys and the private and public keys of an electronic digital signature, to create and verify the authenticity of digital signatures, and to hash, encrypt, and integrity-protect data using the algorithms described in the State Standards of Uzbekistan. It can be used in telecommunications networks, public information systems, and government corporate information systems by embedding it into applications that store, process, and transmit information that does not contain state secrets, as well as in the exchange of information and in ensuring the legal significance of electronic documents.

The cryptography service provider includes the following functional components: a dynamically loadable library that implements a biophysical random number sensor; a dynamic library that implements cryptographic algorithms in accordance with the State Standards of Uzbekistan; a module supporting work with external devices; and an installation module that installs the cryptography service provider in the appropriate operating environment.
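The call pattern of two of the provider's primitives, hashing and keyed integrity protection, can be sketched generically. Since the national standard algorithms of Uzbekistan are not available in common libraries, SHA-256 and HMAC below are stand-ins for the standardized algorithms, illustrating only the interface shape, not the provider's implementation.

```python
import hashlib
import hmac

def digest(data: bytes) -> bytes:
    """Hash primitive (SHA-256 standing in for the national standard)."""
    return hashlib.sha256(data).digest()

def protect(key: bytes, data: bytes) -> bytes:
    """Keyed integrity value computed by the sender over a document."""
    return hmac.new(key, data, hashlib.sha256).digest()

doc = b"electronic document"
key = b"shared-secret-key"  # placeholder; real keys come from the key-generation component
tag = protect(key, doc)
# The receiver recomputes the tag and compares in constant time.
ok = hmac.compare_digest(tag, protect(key, doc))
```

In the provider, such primitives would be exposed by the dynamic cryptographic library and consumed by applications through the embedding interface the abstract describes.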

On the Calculation of Entropy of EEG Transients
Carlos A. Ramírez-Fuentes, Blanca Tovar-Corona, V. Barrera-Figueroa, M. A. Silva-Ramírez, L. I. Garay-Jiménez
Pages: 273-286
In order to characterize abnormal behaviours related to epileptic seizures and other neurological disorders, this paper describes a methodology, and its foundations, for finding an optimum tolerance criterion r to estimate the entropy of abnormal two-second transients from electroencephalogram (EEG) recordings of patients diagnosed with partial epilepsy, attention deficit/hyperactivity disorder (ADHD), and conduct disorder (CD). The EEG signals considered come from patients of different ages diagnosed with these cerebral disorders, in order to investigate the possibility of identifying each disorder using the approximate entropy (ApEn) and sample entropy (SampEn) of the transients, which leads to the determination of an appropriate value for the parameter r. With the present approach it is possible to obtain accurate identification of abnormal transients in short EEG signals and reliable estimates of their entropies.
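For reference, sample entropy can be sketched in a few lines; the template length m and the tolerance r are exactly the parameters whose choice the paper investigates. This is a minimal didactic version run on a toy signal, not the authors' estimator or their EEG data.

```python
import math

def sampen(u, m, r):
    """SampEn(m, r) = -ln(A/B): A and B count template pairs of length
    m+1 and m within Chebyshev distance r, excluding self-matches."""
    def count(k):
        templates = [u[i:i + k] for i in range(len(u) - k + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in
                       zip(templates[i], templates[j])) <= r:
                    c += 1
        return c
    b, a = count(m), count(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A perfectly repeating toy signal is highly predictable, so its
# sample entropy should be close to zero.
regular = [0, 1] * 20
sampen_regular = sampen(regular, 2, 0.2)
```

Because SampEn depends on how many template pairs fall within r, an ill-chosen tolerance either counts almost no matches (unstable estimates) or almost all of them (no discrimination), which is why calibrating r for short transients matters.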

The Legitimization of Improvement Science in Academe
Casey D. Cobb, Patricia Virella
Pages: 287-296
This article examines improvement science (Bryk, 2009) against the backdrop of traditional academic research methods. Improvement science is perhaps most closely aligned with design-based implementation research, and is typically applied to networked communities (e.g., schools, hospitals) with the goal of continuous organizational improvement. Improvement science has earned value among the practitioners and researchers who engage in it, but still seeks more complete legitimacy within the academy. We describe the method of improvement science and situate it within the two paradigms of design research and research design. Examples of its implementation in school reform and university program improvement are shared to illuminate the systematic and dynamic nature of its process. The article speaks to the normalization of research design and design research within the context of "what counts" as research in academia, and where and how improvement science can fit within these traditions.

Seminars in Proactive Artificial Intelligence for Cybersecurity (SPAIC): Consulting and Research
Ehsan Sheybani, Giti Javidi
Pages: 297-305
The authors have designed a platform for research and consulting through a high-level collaborative seminar series to promote networking in proactive artificial intelligence (AI) for cybersecurity (SPAIC). The primary objective is to cover a wide range of techniques in cyber threat intelligence gathering from various social media to dark-net and deep-net, hacker forum discussions, and malicious hacking. The secondary objective is to bring together researchers and consultants in the field to develop automated and advanced methods of attack vector recognition and isolation using AI and machine learning (ML). In most cases, the hidden nature of security issues makes real-time fixes difficult. Advanced AI techniques have proven superior to current static methods in cyber threat detection. There have been numerous recent advances in the field of AI, especially in algorithmic approaches such as speech and signal processing, machine and deep learning, computer vision, robotics, data mining, augmented/virtual reality, blockchain, and cognitive computing. These highly advanced methods provide tremendous opportunities for behavior- and trend-based automated analysis, detection, and prevention of cyber attacks and threats. In addition to the potential development of concepts and whitepapers for a large-scale center, the seminar series will result in the identification and recruitment of industrial, academic, and/or government partnerships in support of initiatives, research, and consulting collaborations, as well as the creation and support of resources such as research consortia, collaboration sites, and social networking tools to facilitate large-scale inter-university research programs in AI and ML for cybersecurity.